Posted to user@phoenix.apache.org by "Squires, Tom (ELS-LON)" <to...@elsevier.com> on 2016/08/18 09:11:46 UTC
Unable to create secondary index with IndexTool - "java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;"
Hi,
I am trying to use the org.apache.phoenix.mapreduce.index.IndexTool to create a secondary index on a table in our Phoenix cluster. We are using HBase 1.2 on Cloudera CDH 5.7.2.
I downloaded the Phoenix 4.8.0 for HBase 1.2 binaries so that we had a version of IndexTool that is compatible with our HBase version, but I am getting the following error when running the tool:
java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;
This looks to me like the IndexTool is expecting a different version of HBase.
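The JVM method descriptor in the error message itself supports this: it encodes two String arguments and, after the closing parenthesis, a return type of HTableDescriptor (the fluent-setter style). A runtime copy of the class whose setValue returns anything else would trigger exactly this NoSuchMethodError. A quick illustrative shell sketch (not part of the original session) decodes it:

```shell
# The JVM method descriptor from the error: two String arguments,
# then the return type after the closing parenthesis.
desc='(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;'

# Everything after ')' is the return type descriptor.
ret="${desc#*)}"
echo "return type: $ret"
# prints: return type: Lorg/apache/hadoop/hbase/HTableDescriptor;
```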
Can anyone please advise? I have pasted shell output below.
Many thanks,
Tom
[ec2-user@ip-10-0-0-229 ~]$ hbase version
HBase 1.2.0-cdh5.7.2
Source code repository file:///data/jenkins/workspace/generic-package-centos64-7-0/topdir/BUILD/hbase-1.2.0-cdh5.7.2 revision=Unknown
Compiled by jenkins on Fri Jul 22 12:21:17 PDT 2016
From source with checksum bc88ae0a54f047ea2506e04326e55353
[ec2-user@ip-10-0-0-229 ~]$ hbase org.apache.phoenix.mapreduce.index.IndexTool --schema MENDELEY --data-table DOCUMENTS --index-table INDEX_PROFILE_ID --output-path /home/user/hadoop
Error: Could not find or load main class org.apache.phoenix.mapreduce.index.IndexTool
[ec2-user@ip-10-0-0-229 ~]$ find apache-phoenix-4.8.0-HBase-1.2-bin/ -name "*.jar" | xargs grep IndexTool.class
Binary file apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-server.jar matches
Binary file apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-core-4.8.0-HBase-1.2.jar matches
Binary file apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-client.jar matches
Binary file apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-hive.jar matches
[ec2-user@ip-10-0-0-229 ~]$ sudo cp apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-server.jar apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-core-4.8.0-HBase-1.2.jar apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-client.jar apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-hive.jar /opt/cloudera/parcels/CDH/lib/hbase/lib/
[ec2-user@ip-10-0-0-229 ~]$ hbase org.apache.phoenix.mapreduce.index.IndexTool --schema MENDELEY --data-table DOCUMENTS --index-table INDEX_PROFILE_ID --output-path /home/user/hadoop
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hbase/lib/phoenix-4.8.0-HBase-1.2-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hbase/lib/phoenix-4.8.0-HBase-1.2-hive.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
16/08/18 05:01:56 INFO zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x63cd9962 connecting to ZooKeeper ensemble=52.49.131.199:2181,52.48.211.214:2181,52.210.25.200:2181
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:host.name=ip-10-0-0-229.eu-west-1.compute.internal
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_67
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/java/jdk1.7.0_67-cloudera/jre
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.class.path=<removed>
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hbase/bin/../lib/native/Linux-amd64-64
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:os.version=3.10.0-327.el7.x86_64
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:user.name=ec2-user
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/ec2-user
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/ec2-user
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=52.49.131.199:2181,52.48.211.214:2181,52.210.25.200:2181 sessionTimeout=60000 watcher=hconnection-0x63cd99620x0, quorum=52.49.131.199:2181,52.48.211.214:2181,52.210.25.200:2181, baseZNode=/hbase
16/08/18 05:01:56 INFO zookeeper.ClientCnxn: Opening socket connection to server 52.210.25.200/52.210.25.200:2181. Will not attempt to authenticate using SASL (unknown error)
16/08/18 05:01:56 INFO zookeeper.ClientCnxn: Socket connection established to 52.210.25.200/52.210.25.200:2181, initiating session
16/08/18 05:01:56 INFO zookeeper.ClientCnxn: Session establishment complete on server 52.210.25.200/52.210.25.200:2181, sessionid = 0x35678e40dae4cee, negotiated timeout = 60000
16/08/18 05:01:57 INFO metrics.Metrics: Initializing metrics system: phoenix
16/08/18 05:01:57 WARN impl.MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-phoenix.properties,hadoop-metrics2.properties
16/08/18 05:01:57 INFO impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
16/08/18 05:01:57 INFO impl.MetricsSystemImpl: phoenix metrics system started
16/08/18 05:01:58 INFO Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;
at org.apache.phoenix.query.ConnectionQueryServicesImpl.generateTableDescriptor(ConnectionQueryServicesImpl.java:756)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1020)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1396)
at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2302)
at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:922)
at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:194)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:343)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:331)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:329)
at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1421)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2353)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2300)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2300)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:231)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:144)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:98)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:57)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:45)
at org.apache.phoenix.mapreduce.index.IndexTool.run(IndexTool.java:188)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.phoenix.mapreduce.index.IndexTool.main(IndexTool.java:394)
[ec2-user@ip-10-0-0-229 ~]$
________________________________
Elsevier Limited. Registered Office: The Boulevard, Langford Lane, Kidlington, Oxford, OX5 1GB, United Kingdom, Registration No. 1982084, Registered in England and Wales.
Re: Unable to create secondary index with IndexTool - "java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;"
Posted by Aaron Molitor <am...@splicemachine.com>.
Tom,
While you're waiting for Cloudera you may be able to gain some deeper insight by looking around here: https://github.com/cloudera/hbase/tree/cdh5-1.2.0_5.7.2 <https://github.com/cloudera/hbase/tree/cdh5-1.2.0_5.7.2>
In my experience, all of the platform vendors (CDH, HDP, MapR) tend to incorporate patches into the published versions of the components in their distributions. The general idea makes sense to me, but it can be challenging when one of those patches includes a breaking API change.
Good Luck!
-Aaron
Re: Unable to create secondary index with IndexTool - "java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;"
Posted by "Squires, Tom (ELS-LON)" <to...@elsevier.com>.
Hi,
We got to the bottom of this. The core of the issue is in the Cloudera version of HBase (hbase-client specifically): 1.2.0-cdh5.7.2 (see: https://www.cloudera.com/documentation/enterprise/release-notes/topics/cdh_vd_cdh5_maven_repo_57x.html).
It looks like the 1.2.0-cdh5.7.2 version of hbase-client contains a different version of the problematic HTableDescriptor class: the setValue(String, String) method has a different return type in 1.2.0-cdh5.7.2 than in 1.2.0.
Phoenix's IndexTool expects the 1.2.0 version (which is bundled with the Phoenix client jar), but the Cloudera version was higher up on the classpath, which caused the issue in my email below.
Our workaround was to rename the Phoenix client jar so that it appears first on the classpath and is therefore the jar from which the class gets loaded, which is the version the IndexTool expects.
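The effect can be illustrated with a toy model of classpath resolution (the directories and dummy "class" files below are purely illustrative, not our real jars): entries are searched left to right and the first one containing the class wins, so whichever hbase-client copy sorts first shadows the rest.

```shell
# Two fake classpath entries, each shipping its own copy of the class.
mkdir -p cp_cdh/org/apache/hadoop/hbase cp_phoenix/org/apache/hadoop/hbase
echo "cdh copy" > cp_cdh/org/apache/hadoop/hbase/HTableDescriptor.class
echo "phoenix copy" > cp_phoenix/org/apache/hadoop/hbase/HTableDescriptor.class

# Mimic -cp cp_cdh:cp_phoenix; the first entry containing the class wins.
for entry in cp_cdh cp_phoenix; do
  f="$entry/org/apache/hadoop/hbase/HTableDescriptor.class"
  if [ -f "$f" ]; then
    echo "loaded from: $entry ($(cat "$f"))"
    break
  fi
done
# prints: loaded from: cp_cdh (cdh copy)
```

Renaming the Phoenix jar simply changes which entry comes first in that search order.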
We plan on talking with Cloudera to gain a better understanding of the differences between 1.2.0 and 1.2.0-cdh5.7.2 of hbase-client: we'll update this thread if anything comes of that discussion.
Regards,
Tom
________________________________
From: Squires, Tom (ELS-LON)
Sent: 18 August 2016 10:11
To: user@phoenix.apache.org
Cc: Narros, Eduardo (ELS-LON)
Subject: Unable to create secondary index with IndexTool - "java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;"
Re: Unable to create secondary index with IndexTool - "java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;"
Posted by "Heather, James (ELS-LON)" <ja...@elsevier.com>.
This usually happens when you're running a fat jar and there's been a dependency versioning conflict while building it. If part of the project (e.g., Phoenix) is built against one version of a lib (e.g., hbase) and another part of the project is built against a different version, only one version ends up inside the jar, and things break.
You'll probably need to shade something somewhere.
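For a Maven build, the kind of relocation James is describing might look like the sketch below (the coordinates and shaded pattern are illustrative, not a drop-in fix; relocating HBase classes has its own compatibility caveats and needs testing against your cluster):

```xml
<!-- Illustrative maven-shade-plugin stanza: bundle and relocate the
     HBase client classes the module was compiled against, so a
     different copy on the cluster classpath can no longer shadow them. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.apache.hadoop.hbase</pattern>
            <shadedPattern>myapp.shaded.org.apache.hadoop.hbase</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```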
On 18 August 2016 10:12:10 a.m. "Squires, Tom (ELS-LON)" <to...@elsevier.com> wrote:
Hi,
I am trying to use the org.apache.phoenix.mapreduce.index.IndexTool to create a secondary index on a table in our Phoenix cluster. We are using HBase 1.2 on Cloudera CDH 5.7.2.
I downloaded the Phoenix 4.8.0 for HBase 1.2 binaries so that we had a version of IndexTool that is compatible with our HBase version, but I am getting the following error when running the tool:
java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;
This looks to me like the IndexTool is expecting a different version of HBase.
Can anyone please advise? I have pasted shell output below.
Many thanks,
Tom
[ec2-user@ip-10-0-0-229 ~]$ hbase version
HBase 1.2.0-cdh5.7.2
Source code repository file:///data/jenkins/workspace/generic-package-centos64-7-0/topdir/BUILD/hbase-1.2.0-cdh5.7.2 revision=Unknown
Compiled by jenkins on Fri Jul 22 12:21:17 PDT 2016
From source with checksum bc88ae0a54f047ea2506e04326e55353
[ec2-user@ip-10-0-0-229 ~]$ hbase org.apache.phoenix.mapreduce.index.IndexTool --schema MENDELEY --data-table DOCUMENTS --index-table INDEX_PROFILE_ID --output-path /home/user/hadoop
Error: Could not find or load main class org.apache.phoenix.mapreduce.index.IndexTool
[ec2-user@ip-10-0-0-229 ~]$ find apache-phoenix-4.8.0-HBase-1.2-bin/ -name "*.jar" | xargs grep IndexTool.class
Binary file apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-server.jar matches
Binary file apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-core-4.8.0-HBase-1.2.jar matches
Binary file apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-client.jar matches
Binary file apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-hive.jar matches
[ec2-user@ip-10-0-0-229 ~]$ sudo cp apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-server.jar apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-core-4.8.0-HBase-1.2.jar apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-client.jar apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-hive.jar /opt/cloudera/parcels/CDH/lib/hbase/lib/
[ec2-user@ip-10-0-0-229 ~]$ hbase org.apache.phoenix.mapreduce.index.IndexTool --schema MENDELEY --data-table DOCUMENTS --index-table INDEX_PROFILE_ID --output-path /home/user/hadoop
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hbase/lib/phoenix-4.8.0-HBase-1.2-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hbase/lib/phoenix-4.8.0-HBase-1.2-hive.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
16/08/18 05:01:56 INFO zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x63cd9962 connecting to ZooKeeper ensemble=52.49.131.199:2181,52.48.211.214:2181,52.210.25.200:2181
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:host.name=ip-10-0-0-229.eu-west-1.compute.internal
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_67
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/java/jdk1.7.0_67-cloudera/jre
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.class.path=<removed>
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hbase/bin/../lib/native/Linux-amd64-64
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:os.version=3.10.0-327.el7.x86_64
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:user.name=ec2-user
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/ec2-user
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/ec2-user
16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=52.49.131.199:2181,52.48.211.214:2181,52.210.25.200:2181 sessionTimeout=60000 watcher=hconnection-0x63cd99620x0, quorum=52.49.131.199:2181,52.48.211.214:2181,52.210.25.200:2181, baseZNode=/hbase
16/08/18 05:01:56 INFO zookeeper.ClientCnxn: Opening socket connection to server 52.210.25.200/52.210.25.200:2181. Will not attempt to authenticate using SASL (unknown error)
16/08/18 05:01:56 INFO zookeeper.ClientCnxn: Socket connection established to 52.210.25.200/52.210.25.200:2181, initiating session
16/08/18 05:01:56 INFO zookeeper.ClientCnxn: Session establishment complete on server 52.210.25.200/52.210.25.200:2181, sessionid = 0x35678e40dae4cee, negotiated timeout = 60000
16/08/18 05:01:57 INFO metrics.Metrics: Initializing metrics system: phoenix
16/08/18 05:01:57 WARN impl.MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-phoenix.properties, hadoop-metrics2.properties
16/08/18 05:01:57 INFO impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
16/08/18 05:01:57 INFO impl.MetricsSystemImpl: phoenix metrics system started
16/08/18 05:01:58 INFO Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;
at org.apache.phoenix.query.ConnectionQueryServicesImpl.generateTableDescriptor(ConnectionQueryServicesImpl.java:756)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1020)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1396)
at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2302)
at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:922)
at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:194)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:343)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:331)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:329)
at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1421)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2353)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2300)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2300)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:231)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:144)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:98)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:57)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:45)
at org.apache.phoenix.mapreduce.index.IndexTool.run(IndexTool.java:188)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.phoenix.mapreduce.index.IndexTool.main(IndexTool.java:394)
[ec2-user@ip-10-0-0-229 ~]$