Posted to user@phoenix.apache.org by "M. Aaron Bossert" <ma...@gmail.com> on 2018/11/27 15:48:21 UTC

client does not have phoenix.schema.isNamespaceMappingEnabled

Folks,

I believe I have followed all the directions for turning on namespace
mapping, as well as the extra classpath steps required to use the MapReduce
bulk load utility, but I am still running into this error. I am running a
Hortonworks cluster with both HDP v3.0.1 and HDF components. Here is what I
have tried:


   - Checked that the proper hbase-site.xml (in my case:
   /etc/hbase/3.0.1.0-187/0/hbase-site.xml) is being referenced when
   launching the MapReduce utility:


    ...

    <property>
      <name>phoenix.schema.isNamespaceMappingEnabled</name>
      <value>true</value>
    </property>

    <property>
      <name>phoenix.schema.mapSystemTablesToNamespace</name>
      <value>true</value>
    </property>

    ...
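To double-check that the file actually carries both flags, a grep pass like this sketch works (it writes a tiny sample config so it is self-contained; in practice, point SITE at the real file, e.g. /etc/hbase/3.0.1.0-187/0/hbase-site.xml):

```shell
# Self-contained sketch: write a minimal sample hbase-site.xml, then verify
# both namespace-mapping flags are present and set to true.
SITE=$(mktemp)
cat > "$SITE" <<'EOF'
<configuration>
  <property>
    <name>phoenix.schema.isNamespaceMappingEnabled</name>
    <value>true</value>
  </property>
  <property>
    <name>phoenix.schema.mapSystemTablesToNamespace</name>
    <value>true</value>
  </property>
</configuration>
EOF

missing=0
for prop in phoenix.schema.isNamespaceMappingEnabled \
            phoenix.schema.mapSystemTablesToNamespace; do
  # check the line after each <name> for a true <value>
  if grep -A1 "<name>${prop}</name>" "$SITE" | grep -q '<value>true</value>'; then
    echo "${prop}=true"
  else
    echo "${prop} missing or not true"
    missing=1
  fi
done
rm -f "$SITE"
```

Note that the grep -A1 trick assumes the value sits on the line directly after the name, as it does in Ambari-managed configs; a file with different whitespace would need an XML-aware check instead.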

   - Added the appropriate classpath entries to the hadoop jar command
   (ZooKeeper quorum hostnames changed to remove my corporate network info,
   as well as the data directory):

HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/hbase-site.xml \
hadoop jar /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar \
  org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input /ingest/MYCSV -z zk1,zk2,zk3 -g


...


18/11/27 15:31:48 INFO zookeeper.ReadOnlyZKClient: Close zookeeper connection 0x1d58d65f to master-1.punch.datareservoir.net:2181,master-2.punch.datareservoir.net:2181,master-3.punch.datareservoir.net:2181

18/11/27 15:31:48 INFO log.QueryLoggerDisruptor: Shutting down QueryLoggerDisruptor..

Exception in thread "main" java.sql.SQLException: ERROR 726 (43M10): Inconsistent namespace mapping properties. Cannot initiate connection as SYSTEM:CATALOG is found but client does not have phoenix.schema.isNamespaceMappingEnabled enabled
    at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:494)
    at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:150)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1113)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1501)
    at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2740)
    at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:1114)
    at org.apache.phoenix.compile.CreateTableCompiler$1.execute(CreateTableCompiler.java:192)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:408)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:391)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:390)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:378)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1806)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2569)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2532)
    at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2532)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
    at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:400)
    at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:392)
    at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:206)
    at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:180)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
    at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232)

18/11/27 15:31:48 INFO zookeeper.ZooKeeper: Session: 0x3672eebffa800c8 closed

18/11/27 15:31:48 INFO zookeeper.ClientCnxn: EventThread shut down

   - Also tried the other recommended option:

HADOOP_CLASSPATH=$(hbase mapredcp):/etc/hbase/3.0.1.0-187/0/hbase-site.xml \
hadoop jar /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar \
  org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input /ingest/MYCSV -z zk1,zk2,zk3 -g


...


18/11/27 15:31:48 INFO zookeeper.ReadOnlyZKClient: Close zookeeper connection 0x1d58d65f to master-1.punch.datareservoir.net:2181,master-2.punch.datareservoir.net:2181,master-3.punch.datareservoir.net:2181

18/11/27 15:31:48 INFO log.QueryLoggerDisruptor: Shutting down QueryLoggerDisruptor..

Exception in thread "main" java.sql.SQLException: ERROR 726 (43M10): Inconsistent namespace mapping properties. Cannot initiate connection as SYSTEM:CATALOG is found but client does not have phoenix.schema.isNamespaceMappingEnabled enabled
    (stack trace identical to the first attempt above)

18/11/27 15:31:48 INFO zookeeper.ZooKeeper: Session: 0x3672eebffa800c8 closed

18/11/27 15:31:48 INFO zookeeper.ClientCnxn: EventThread shut down

   - As well as the recommended approach in the HBase reference guide
   linked in the Phoenix docs:

HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` \
hadoop jar /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar \
  org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input /ingest/MYCSV -z zk1,zk2,zk3 -g


Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/cli/DefaultParser
    at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.parseOptions(AbstractBulkLoadTool.java:128)
    at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:176)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
    at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.cli.DefaultParser
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 11 more
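The NoClassDefFoundError above means that nothing on the supplied classpath provides commons-cli. A crude, name-based way to eyeball a colon-separated classpath for it (only a heuristic, since the class could also live inside a shaded jar; the function name is made up):

```shell
# Heuristic sketch: scan a colon-separated classpath (e.g. the output of
# `hbase mapredcp` or `hbase classpath`) for a jar whose file name mentions
# commons-cli. This checks names only, not jar contents.
has_commons_cli() {
  echo "$1" | tr ':' '\n' | grep -q 'commons-cli' && echo yes || echo no
}

has_commons_cli "a.jar:commons-cli-1.2.jar"  # prints yes
```

In practice one would run it as has_commons_cli "$(hbase classpath)" to see whether that particular helper script pulls the jar in.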

   - And finally, here is what the tables look like in both the HBase shell
   and sqlline:

hbase shell

HBase Shell
Use "help" to get list of supported commands.
Use "exit" to quit this interactive shell.
Version 2.0.0.3.0.1.0-187, re9fcf450949102de5069b257a6dee469b8f5aab3, Wed Sep 19 10:16:35 UTC 2018
Took 0.0016 seconds

hbase(main):001:0> list
TABLE
ATLAS_ENTITY_AUDIT_EVENTS
MYTABLE
SYSTEM:CATALOG
SYSTEM:FUNCTION
SYSTEM:LOG
SYSTEM:MUTEX
SYSTEM:SEQUENCE
SYSTEM:STATS
atlas_janus
9 row(s)
Took 0.6114 seconds

=> ["ATLAS_ENTITY_AUDIT_EVENTS", "MYTABLE", "SYSTEM:CATALOG", "SYSTEM:FUNCTION", "SYSTEM:LOG", "SYSTEM:MUTEX", "SYSTEM:SEQUENCE", "SYSTEM:STATS", "atlas_janus"]





phoenix-sqlline master-1.punch.datareservoir.net

Setting property: [incremental, false]
Setting property: [isolation, TRANSACTION_READ_COMMITTED]
issuing: !connect jdbc:phoenix:mysrv none none org.apache.phoenix.jdbc.PhoenixDriver
Connecting to jdbc:phoenix:mysrv
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
18/11/27 15:45:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Connected to: Phoenix (version 5.0)
Driver: PhoenixEmbeddedDriver (version 5.0)
Autocommit status: true
Transaction isolation: TRANSACTION_READ_COMMITTED
Building list of tables and columns for tab-completion (set fastconnect to true to skip)...
144/144 (100%) Done
Done
sqlline version 1.2.0

0: jdbc:phoenix:mysrv> !tables
+------------+--------------+-------------+---------------+-----------------+---------------+---------------+
| TABLE_CAT  | TABLE_SCHEM  | TABLE_NAME  | TABLE_TYPE    | IMMUTABLE_ROWS  | SALT_BUCKETS  | MULTI_TENANT  |
+------------+--------------+-------------+---------------+-----------------+---------------+---------------+
|            | SYSTEM       | CATALOG     | SYSTEM TABLE  | false           | null          | false         |
|            | SYSTEM       | FUNCTION    | SYSTEM TABLE  | false           | null          | false         |
|            | SYSTEM       | LOG         | SYSTEM TABLE  | true            | 32            | false         |
|            | SYSTEM       | SEQUENCE    | SYSTEM TABLE  | false           | null          | false         |
|            | SYSTEM       | STATS       | SYSTEM TABLE  | false           | null          | false         |
|            |              | MYTABLE     | TABLE         | false           | 5             | false         |
+------------+--------------+-------------+---------------+-----------------+---------------+---------------+
(The empty REMARKS, TYPE_NAME, SELF_REFERENCING_COL_NAME, REF_GENERATION, INDEX_STATE, VIEW_STATEMENT, VIEW_TYPE and truncated INDEX_T columns are omitted here.)
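For what it's worth, the root cause here (confirmed in the replies below) is that a JVM classpath entry must be a jar or a directory; a bare hbase-site.xml entry is silently ignored. A sketch of a hypothetical helper (the function name is made up) that normalizes such entries:

```shell
# Sketch: replace any bare *.xml classpath entry with its parent directory,
# since the JVM scans jars and directories on the classpath, not plain files.
fix_classpath() {
  echo "$1" | awk -F: '{
    for (i = 1; i <= NF; i++) {
      e = $i
      if (e ~ /\.xml$/) sub(/\/[^\/]*$/, "", e)  # keep the directory, drop the file
      printf "%s%s", e, (i < NF ? ":" : "\n")
    }
  }'
}

fix_classpath "/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/hbase-site.xml"
# prints: /usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0
```

The printed value is exactly the HADOOP_CLASSPATH that ended up working.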

Re: client does not have phoenix.schema.isNamespaceMappingEnabled

Posted by Ajit Bhingarkar <aj...@nielsen.com>.
user-unsubscribe@phoenix.apache.org

On Fri, Nov 30, 2018 at 12:04 AM M. Aaron Bossert <ma...@gmail.com>
wrote:

> So, sorry for the super late reply; there is a weird lag between the time
> a message is sent to this mailing list and when I actually see it. But I
> have got it working now as follows:
>
>
> HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/ hadoop jar ...
>
> using this did not work:
>
> HADOOP_CLASSPATH="$(hbase mapredcp)" hadoop jar ...
>
>
> the output of that command separately is this:
>
> [user@server /somedir]$ hbase mapredcp
>
>
> /usr/hdp/3.0.1.0-187/hbase/lib/hbase-shaded-protobuf-2.1.0.jar:/usr/hdp/3.0.1.0-187/zookeeper/zookeeper-3.4.6.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/htrace-core4-4.2.0-incubating.jar:/usr/hdp/3.0.1.0-187/hbase/lib/commons-lang3-3.6.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-server-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol-shaded-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-hadoop2-compat-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-mapreduce-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-metrics-api-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/3.0.1.0-187/hbase/lib/metrics-core-3.2.1.jar:/usr/hdp/3.0.1.0-187/hbase/lib/jackson-databind-2.9.5.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-client-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-hadoop-compat-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-shaded-netty-2.1.0.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-shaded-miscellaneous-2.1.0.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-metrics-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-common-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-zookeeper-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/jackson-annotations-2.9.5.jar:/usr/hdp/3.0.1.0-187/hbase/lib/jackson-core-2.9.5.jar
>
> On Tue, Nov 27, 2018 at 4:26 PM Josh Elser <el...@apache.org> wrote:
>
>> To add a non-jar file to the classpath of a Java application, you must
>> add the directory containing that file to the classpath.
>>
>> Thus, the following is wrong:
>>
>> HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/hbase-site.xml
>>
>> And should be:
>>
>> HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/
>>
>> Most times, including the output of `hbase mapredcp` is sufficient, a la
>>
>> HADOOP_CLASSPATH="$(hbase mapredcp)" hadoop jar ...
>>
>> On 11/27/18 10:48 AM, M. Aaron Bossert wrote:
>> > Folks,
>> >
>> > I have, I believe, followed all the directions for turning on namespace
>> > mapping as well as extra steps to (added classpath) required to use the
>> > mapreduce bulk load utility, but am still running into this error...I
>> am
>> > running a Hortonworks cluster with both HDP v 3.0.1 and HDF
>> components.
>> > Here is what I have tried:
>> >
>> >   * Checked that the proper hbase-site.xml (in my case:
>> >     /etc/hbase/3.0.1.0-187/0/hbase-site.xml) file is being referenced
>> >     when launching the mapreduce utility:
>> >
>> >
>> >      ...
>> >
>> >
>> > <property>
>> >
>> > <name>phoenix.schema.isNamespaceMappingEnabled</name>
>> >
>> > <value>true</value>
>> >
>> > </property>
>> >
>> > <property>
>> >
>> > <name>phoenix.schema.mapSystemTablesToNamespace</name>
>> >
>> > <value>true</value>
>> >
>> > </property>
>> >
>> >
>> >      ...
>> >
>> >   * added the appropriate classpath additions to the hadoop jar command
>> >     (zookeeper quorum hostnames changed to remove my corporate network
>> >     info as well as data directory):
>> >
>> >
>> HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/
>> 3.0.1.0-187/0/hbase-site.xml
>> > hadoop jar
>> > /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar
>> > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input
>> > /ingest/MYCSV -z zk1,zk2,zk3 -g
>> >
>> >
>> > ...
>> >
>> >
>> > 18/11/27 15:31:48 INFO zookeeper.ReadOnlyZKClient: Close zookeeper
>> > connection 0x1d58d65f to master-1.punch.datareservoir.net:2181
>> > <http://master-1.punch.datareservoir.net:2181>,
>> master-2.punch.datareservoir.net:2181
>> > <http://master-2.punch.datareservoir.net:2181>,
>> master-3.punch.datareservoir.net:2181
>> > <http://master-3.punch.datareservoir.net:2181>
>> >
>> > 18/11/27 15:31:48 INFO log.QueryLoggerDisruptor: Shutting down
>> > QueryLoggerDisruptor..
>> >
>> > Exception in thread "main" java.sql.SQLException: ERROR 726
>> > (43M10):Inconsistent namespace mapping properties. Cannot initiate
>> > connection as SYSTEM:CATALOG is found but client does not have
>> > phoenix.schema.isNamespaceMappingEnabled enabled
>> >
>> > at
>> >
>> org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:494)
>> >
>> > at
>> >
>> org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:150)
>> >
>> > at
>> >
>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1113)
>> >
>> > at
>> >
>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1501)
>> >
>> > at
>> >
>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2740)
>> >
>> > at
>> >
>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:1114)
>> >
>> > at
>> >
>> org.apache.phoenix.compile.CreateTableCompiler$1.execute(CreateTableCompiler.java:192)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:408)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:391)
>> >
>> > at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:390)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:378)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1806)
>> >
>> > at
>> >
>> org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2569)
>> >
>> > at
>> >
>> org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2532)
>> >
>> > at
>> >
>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
>> >
>> > at
>> >
>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2532)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
>> >
>> > at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
>> >
>> > at java.sql.DriverManager.getConnection(DriverManager.java:664)
>> >
>> > at java.sql.DriverManager.getConnection(DriverManager.java:208)
>> >
>> > at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:400)
>> >
>> > at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:392)
>> >
>> > at
>> >
>> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:206)
>> >
>> > at
>> >
>> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:180)
>> >
>> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>> >
>> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>> >
>> > at
>> >
>> org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
>> >
>> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >
>> > at
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> >
>> > at
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >
>> > at java.lang.reflect.Method.invoke(Method.java:498)
>> >
>> > at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
>> >
>> > at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
>> >
>> > 18/11/27 15:31:48 INFO zookeeper.ZooKeeper: Session: 0x3672eebffa800c8
>> > closed
>> >
>> > 18/11/27 15:31:48 INFO zookeeper.ClientCnxn: EventThread shut down
>> >
>> >   * Also tried the other recommended option:
>> >
>> > HADOOP_CLASSPATH=$(hbase
>> > mapredcp):/etc/hbase/3.0.1.0-187/0/hbase-site.xml hadoop jar
>> > /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar
>> > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input
>> > /ingest/MYCSV -z zk1,zk2,zk3 -g
>> >
>> >
>> > ...
>> >
>> >
>> > 18/11/27 15:31:48 INFO zookeeper.ReadOnlyZKClient: Close zookeeper
>> > connection 0x1d58d65f to master-1.punch.datareservoir.net:2181
>> > <http://master-1.punch.datareservoir.net:2181>,
>> master-2.punch.datareservoir.net:2181
>> > <http://master-2.punch.datareservoir.net:2181>,
>> master-3.punch.datareservoir.net:2181
>> > <http://master-3.punch.datareservoir.net:2181>
>> >
>> > 18/11/27 15:31:48 INFO log.QueryLoggerDisruptor: Shutting down
>> > QueryLoggerDisruptor..
>> >
>> > Exception in thread "main" java.sql.SQLException: ERROR 726
>> > (43M10):Inconsistent namespace mapping properties. Cannot initiate
>> > connection as SYSTEM:CATALOG is found but client does not have
>> > phoenix.schema.isNamespaceMappingEnabled enabled
>> >
>> > at
>> >
>> org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:494)
>> >
>> > at
>> >
>> org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:150)
>> >
>> > at
>> >
>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1113)
>> >
>> > at
>> >
>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1501)
>> >
>> > at
>> >
>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2740)
>> >
>> > at
>> >
>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:1114)
>> >
>> > at
>> >
>> org.apache.phoenix.compile.CreateTableCompiler$1.execute(CreateTableCompiler.java:192)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:408)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:391)
>> >
>> > at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:390)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:378)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1806)
>> >
>> > at
>> >
>> org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2569)
>> >
>> > at
>> >
>> org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2532)
>> >
>> > at
>> >
>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
>> >
>> > at
>> >
>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2532)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
>> >
>> > at
>> >
>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
>> >
>> > at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
>> >
>> > at java.sql.DriverManager.getConnection(DriverManager.java:664)
>> >
>> > at java.sql.DriverManager.getConnection(DriverManager.java:208)
>> >
>> > at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:400)
>> >
>> > at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:392)
>> >
>> > at
>> >
>> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:206)
>> >
>> > at
>> >
>> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:180)
>> >
>> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>> >
>> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>> >
>> > at
>> >
>> org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
>> >
>> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >
>> > at
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> >
>> > at
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >
>> > at java.lang.reflect.Method.invoke(Method.java:498)
>> >
>> > at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
>> >
>> > at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
>> >
>> > 18/11/27 15:31:48 INFO zookeeper.ZooKeeper: Session: 0x3672eebffa800c8
>> > closed
>> >
>> > 18/11/27 15:31:48 INFO zookeeper.ClientCnxn: EventThread shut down
>> >
>> >   * As well as the recommended approach in the HBase reference guide
>> >     linked in the Phoenix docs:
>> >
>> > HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar
>> > /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar
>> > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input
>> > /ingest/MYCSV -z zk1,zk2,zk3 -g
>> >
>> >
>> > Exception in thread "main" java.lang.NoClassDefFoundError:
>> > org/apache/commons/cli/DefaultParser
>> >
>> > at
>> >
>> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.parseOptions(AbstractBulkLoadTool.java:128)
>> >
>> > at
>> >
>> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:176)
>> >
>> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>> >
>> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>> >
>> > at
>> >
>> org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
>> >
>> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >
>> > at
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> >
>> > at
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >
>> > at java.lang.reflect.Method.invoke(Method.java:498)
>> >
>> > at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
>> >
>> > at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
>> >
>> > Caused by: java.lang.ClassNotFoundException:
>> > org.apache.commons.cli.DefaultParser
>> >
>> > at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>> >
>> > at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>> >
>> > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>> >
>> > at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>> >
>> > ... 11 more
>> >
>> >   * And finally, here is what the tables look like in both Hbase shell
>> >     and sqlline:
>> >
>> > hbase shell
>> >
>> > HBase Shell
>> >
>> > Use "help" to get list of supported commands.
>> >
>> > Use "exit" to quit this interactive shell.
>> >
>> > Version 2.0.0.3.0.1.0-187, re9fcf450949102de5069b257a6dee469b8f5aab3,
>> > Wed Sep 19 10:16:35 UTC 2018
>> >
>> > Took 0.0016 seconds
>> >
>> > hbase(main):001:0> list
>> >
>> > TABLE
>> >
>> > ATLAS_ENTITY_AUDIT_EVENTS
>> >
>> > MYTABLE
>> >
>> > SYSTEM:CATALOG
>> >
>> > SYSTEM:FUNCTION
>> >
>> > SYSTEM:LOG
>> >
>> > SYSTEM:MUTEX
>> >
>> > SYSTEM:SEQUENCE
>> >
>> > SYSTEM:STATS
>> >
>> > atlas_janus
>> >
>> > 9 row(s)
>> >
>> > Took 0.6114 seconds
>> >
>> > => ["ATLAS_ENTITY_AUDIT_EVENTS", "MYTABLE", "SYSTEM:CATALOG",
>> > "SYSTEM:FUNCTION", "SYSTEM:LOG", "SYSTEM:MUTEX", "SYSTEM:SEQUENCE",
>> > "SYSTEM:STATS", "atlas_janus"]
>> >
>> >
>> >
>> >
>> >
>> > phoenix-sqlline master-1.punch.datareservoir.net
>> > <http://master-1.punch.datareservoir.net>
>> >
>> > *Setting property: [incremental, false]*
>> >
>> > *Setting property: [isolation, TRANSACTION_READ_COMMITTED]*
>> >
>> > *issuing: !connect jdbc:phoenix:mysrv none none
>> > org.apache.phoenix.jdbc.PhoenixDriver*
>> >
>> > *Connecting to jdbc:phoenix:mysrv*
>> >
>> > SLF4J: Class path contains multiple SLF4J bindings.
>> >
>> > SLF4J: Found binding in
>> >
>> [jar:file:/usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> >
>> > SLF4J: Found binding in
>> >
>> [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> >
>> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> > explanation.
>> >
>> > 18/11/27 15:45:51 WARN util.NativeCodeLoader: Unable to load
>> > native-hadoop library for your platform... using builtin-java classes
>> > where applicable
>> >
>> > *Connected to: Phoenix (version 5.0)*
>> >
>> > *Driver: PhoenixEmbeddedDriver (version 5.0)*
>> >
>> > *Autocommit status: true*
>> >
>> > *Transaction isolation: TRANSACTION_READ_COMMITTED*
>> >
>> > Building list of tables and columns for tab-completion (set fastconnect
>> > to true to skip)...
>> >
>> > 144/144 (100%) Done
>> >
>> > Done
>> >
>> > sqlline version 1.2.0
>> >
>> > 0: jdbc:phoenix:mysrv> !tables
>> >
>> >
>> > +------------+--------------+-------------+---------------+----------+------------+----------------------------+-----------------+--------------+-----------------+---------------+---------------+-----------------+------------+---------+
>> > | TABLE_CAT  | TABLE_SCHEM  | TABLE_NAME  | TABLE_TYPE    | REMARKS  | TYPE_NAME  | SELF_REFERENCING_COL_NAME  | REF_GENERATION  | INDEX_STATE  | IMMUTABLE_ROWS  | SALT_BUCKETS  | MULTI_TENANT  | VIEW_STATEMENT  | VIEW_TYPE  | INDEX_T |
>> > +------------+--------------+-------------+---------------+----------+------------+----------------------------+-----------------+--------------+-----------------+---------------+---------------+-----------------+------------+---------+
>> > |            | SYSTEM       | CATALOG     | SYSTEM TABLE  |          |            |                            |                 |              | false           | null          | false         |                 |            |         |
>> > |            | SYSTEM       | FUNCTION    | SYSTEM TABLE  |          |            |                            |                 |              | false           | null          | false         |                 |            |         |
>> > |            | SYSTEM       | LOG         | SYSTEM TABLE  |          |            |                            |                 |              | true            | 32            | false         |                 |            |         |
>> > |            | SYSTEM       | SEQUENCE    | SYSTEM TABLE  |          |            |                            |                 |              | false           | null          | false         |                 |            |         |
>> > |            | SYSTEM       | STATS       | SYSTEM TABLE  |          |            |                            |                 |              | false           | null          | false         |                 |            |         |
>> > |            |              | MYTABLE     | TABLE         |          |            |                            |                 |              | false           | 5             | false         |                 |            |         |
>> > +------------+--------------+-------------+---------------+----------+------------+----------------------------+-----------------+--------------+-----------------+---------------+---------------+-----------------+------------+---------+
>> >
>>
>

Re: client does not have phoenix.schema.isNamespaceMappingEnabled

Posted by "M. Aaron Bossert" <ma...@gmail.com>.
Gotcha...perhaps I was misunderstanding the tutorial that suggested the
HADOOP_CLASSPATH="$(hbase mapredcp)" hadoop jar ... approach. Thanks for
all the help, folks...much appreciated!
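For the record, here is a runnable sketch of the classpath construction that ended up working for me. The `mapredcp` function below is only a stand-in for the real `hbase mapredcp` command (which prints the HBase MapReduce jars), so the sketch runs anywhere; on a real cluster you would use "$(hbase mapredcp)" directly, and the paths are the HDP 3.0.1 ones from this thread:

```shell
# Stand-in for `hbase mapredcp`; substitute "$(hbase mapredcp)" on a cluster.
mapredcp() { echo "/usr/hdp/3.0.1.0-187/hbase/lib/hbase-client.jar"; }

# Prepend the HBase *conf directory* (not hbase-site.xml itself) so the
# namespace-mapping properties reach the client and the MR tasks.
HADOOP_CLASSPATH="/etc/hbase/3.0.1.0-187/0:$(mapredcp)"
echo "$HADOOP_CLASSPATH"
```

With that variable exported, the `hadoop jar ... CsvBulkLoadTool` invocation is unchanged from the one earlier in the thread.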

On Thu, Nov 29, 2018 at 3:13 PM Josh Elser <el...@apache.org> wrote:

> Why didn't it work?
>
> The hbase-protocol.jar is insufficient to run MapReduce jobs against
> HBase; full stop. You're going to get lots of stuff pulled in via the
> phoenix-client.jar that you give to `hadoop jar`. That said, I can't
> think of a reason that including more jars on the classpath would be
> harmful.
>
> Realistically, you might only need to provide HBASE_CONF_DIR to the
> HADOOP_CLASSPATH env variable, so that your mappers and reducers also
> get it on their classpath. The rest of the Java classes would be
> automatically localized via `hadoop jar`.
>
> On 11/29/18 1:27 PM, M. Aaron Bossert wrote:
> > So, sorry for the super late reply...there is a weird lag between the time
> > a message is sent to this mailing list and when I actually
> > see it... But I have got it working now as follows:
> >
> >
> > HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/ hadoop jar ...
> >
> > using this did not work:
> >
> > HADOOP_CLASSPATH="$(hbase mapredcp)" hadoop jar ...
> >
> >
> > the output of that command separately is this:
> >
> > [user@server /somedir]$ hbase mapredcp
> >
> >
> > /usr/hdp/3.0.1.0-187/hbase/lib/hbase-shaded-protobuf-2.1.0.jar:/usr/hdp/3.0.1.0-187/zookeeper/zookeeper-3.4.6.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/htrace-core4-4.2.0-incubating.jar:/usr/hdp/3.0.1.0-187/hbase/lib/commons-lang3-3.6.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-server-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol-shaded-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-hadoop2-compat-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-mapreduce-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-metrics-api-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/3.0.1.0-187/hbase/lib/metrics-core-3.2.1.jar:/usr/hdp/3.0.1.0-187/hbase/lib/jackson-databind-2.9.5.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-client-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-hadoop-compat-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-shaded-netty-2.1.0.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-shaded-miscellaneous-2.1.0.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-metrics-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-common-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-zookeeper-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/jackson-annotations-2.9.5.jar:/usr/hdp/3.0.1.0-187/hbase/lib/jackson-core-2.9.5.jar
> >
> >
> > On Tue, Nov 27, 2018 at 4:26 PM Josh Elser <elserj@apache.org
> > <ma...@apache.org>> wrote:
> >
> >     To add a non-jar file to the classpath of a Java application, you
> must
> >     add the directory containing that file to the classpath.
> >
> >     Thus, the following is wrong:
> >
> >     HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/hbase-site.xml
> >
> >     And should be:
> >
> >     HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/
> >
> >     Most times, including the output of `hbase mapredcp` is sufficient
> ala
> >
> >     HADOOP_CLASSPATH="$(hbase mapredcp)" hadoop jar ...
> >
> >     On 11/27/18 10:48 AM, M. Aaron Bossert wrote:
> >      > Folks,
> >      >
> >      > I have, I believe, followed all the directions for turning on
> >     namespace
> >      > mapping as well as extra steps to (added classpath) required to
> >     use the
> >      > mapreduce bulk load utility, but am still running into this
> >     error...I am
> >      > running a Hortonworks cluster with both HDP v 3.0.1 and HDF
> >     components.
> >      > Here is what I have tried:
> >      >
> >      >   * Checked that the proper hbase-site.xml (in my case:
> >      >     /etc/hbase/3.0.1.0-187/0/hbase-site.xml) file is being
> referenced
> >      >     when launching the mapreduce utility:
> >      >
> >      >
> >      >      ...
> >      >
> >      >
> >      > <property>
> >      >
> >      > <name>phoenix.schema.isNamespaceMappingEnabled</name>
> >      >
> >      > <value>true</value>
> >      >
> >      > </property>
> >      >
> >      > <property>
> >      >
> >      > <name>phoenix.schema.mapSystemTablesToNamespace</name>
> >      >
> >      > <value>true</value>
> >      >
> >      > </property>
> >      >
> >      >
> >      >      ...
> >      >
> >      >   * added the appropriate classpath additions to the hadoop jar
> >     command
> >      >     (zookeeper quorum hostnames changed to remove my corporate
> >     network
> >      >     info as well as data directory):
> >      >
> >      >
> >
> >      > HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/hbase-site.xml
> >      > hadoop jar
> >      > /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar
> >      > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE
> --input
> >      > /ingest/MYCSV -z zk1,zk2,zk3 -g
> >      >
> >      >
> >      > ...
> >      >
> >      >
> >      > 18/11/27 15:31:48 INFO zookeeper.ReadOnlyZKClient: Close zookeeper
> >      > connection 0x1d58d65f to master-1.punch.datareservoir.net:2181,master-2.punch.datareservoir.net:2181,master-3.punch.datareservoir.net:2181
> >      >
> >      > 18/11/27 15:31:48 INFO log.QueryLoggerDisruptor: Shutting down
> >      > QueryLoggerDisruptor..
> >      >
> >      > Exception in thread "main" java.sql.SQLException: ERROR 726
> >      > (43M10):Inconsistent namespace mapping properties. Cannot initiate
> >      > connection as SYSTEM:CATALOG is found but client does not have
> >      > phoenix.schema.isNamespaceMappingEnabled enabled
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:494)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:150)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1113)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1501)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2740)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:1114)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.compile.CreateTableCompiler$1.execute(CreateTableCompiler.java:192)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:408)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:391)
> >      >
> >      > at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:390)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:378)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1806)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2569)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2532)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2532)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
> >      >
> >      > at
> >     org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
> >      >
> >      > at java.sql.DriverManager.getConnection(DriverManager.java:664)
> >      >
> >      > at java.sql.DriverManager.getConnection(DriverManager.java:208)
> >      >
> >      > at
> >     org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:400)
> >      >
> >      > at
> >     org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:392)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:206)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:180)
> >      >
> >      > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
> >      >
> >      > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
> >      >
> >      > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >      >
> >      > at
> >      >
> >
>  sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >      >
> >      > at
> >      >
> >
>  sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >      >
> >      > at java.lang.reflect.Method.invoke(Method.java:498)
> >      >
> >      > at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
> >      >
> >      > at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
> >      >
> >      > 18/11/27 15:31:48 INFO zookeeper.ZooKeeper: Session:
> >     0x3672eebffa800c8
> >      > closed
> >      >
> >      > 18/11/27 15:31:48 INFO zookeeper.ClientCnxn: EventThread shut down
> >      >
> >      >   * Also tried the other recommended option:
> >      >
> >      > HADOOP_CLASSPATH=$(hbase
> >      > mapredcp):/etc/hbase/3.0.1.0-187/0/hbase-site.xml hadoop jar
> >      > /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar
> >      > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE
> --input
> >      > /ingest/MYCSV -z zk1,zk2,zk3 -g
> >      >
> >      >
> >      > ...
> >      >
> >      >
> >      > 18/11/27 15:31:48 INFO zookeeper.ReadOnlyZKClient: Close zookeeper
> >      > connection 0x1d58d65f to master-1.punch.datareservoir.net:2181,master-2.punch.datareservoir.net:2181,master-3.punch.datareservoir.net:2181
> >      >
> >      > 18/11/27 15:31:48 INFO log.QueryLoggerDisruptor: Shutting down
> >      > QueryLoggerDisruptor..
> >      >
> >      > Exception in thread "main" java.sql.SQLException: ERROR 726
> >      > (43M10):Inconsistent namespace mapping properties. Cannot initiate
> >      > connection as SYSTEM:CATALOG is found but client does not have
> >      > phoenix.schema.isNamespaceMappingEnabled enabled
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:494)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:150)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1113)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1501)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2740)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:1114)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.compile.CreateTableCompiler$1.execute(CreateTableCompiler.java:192)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:408)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:391)
> >      >
> >      > at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:390)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:378)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1806)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2569)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2532)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2532)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
> >      >
> >      > at
> >     org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
> >      >
> >      > at java.sql.DriverManager.getConnection(DriverManager.java:664)
> >      >
> >      > at java.sql.DriverManager.getConnection(DriverManager.java:208)
> >      >
> >      > at
> >     org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:400)
> >      >
> >      > at
> >     org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:392)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:206)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:180)
> >      >
> >      > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
> >      >
> >      > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
> >      >
> >      > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >      >
> >      > at
> >      >
> >
>  sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >      >
> >      > at
> >      >
> >
>  sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >      >
> >      > at java.lang.reflect.Method.invoke(Method.java:498)
> >      >
> >      > at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
> >      >
> >      > at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
> >      >
> >      > 18/11/27 15:31:48 INFO zookeeper.ZooKeeper: Session:
> >     0x3672eebffa800c8
> >      > closed
> >      >
> >      > 18/11/27 15:31:48 INFO zookeeper.ClientCnxn: EventThread shut down
> >      >
> >      >   * As well as the recommended approach in the HBase reference
> guide
> >      >     linked in the Phoenix docs:
> >      >
> >      > HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar
> >      > /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar
> >      > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE
> --input
> >      > /ingest/MYCSV -z zk1,zk2,zk3 -g
> >      >
> >      >
> >      > Exception in thread "main" java.lang.NoClassDefFoundError:
> >      > org/apache/commons/cli/DefaultParser
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.mapreduce.AbstractBulkLoadTool.parseOptions(AbstractBulkLoadTool.java:128)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:176)
> >      >
> >      > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
> >      >
> >      > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
> >      >
> >      > at
> >      >
> >
>  org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
> >      >
> >      > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >      >
> >      > at
> >      >
> >
>  sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >      >
> >      > at
> >      >
> >
>  sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >      >
> >      > at java.lang.reflect.Method.invoke(Method.java:498)
> >      >
> >      > at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
> >      >
> >      > at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
> >      >
> >      > Caused by: java.lang.ClassNotFoundException:
> >      > org.apache.commons.cli.DefaultParser
> >      >
> >      > at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> >      >
> >      > at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> >      >
> >      > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
> >      >
> >      > at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> >      >
> >      > ... 11 more
> >      >
> >      >   * And finally, here is what the tables look like in both Hbase
> >     shell
> >      >     and sqlline:
> >      >
> >      > hbase shell
> >      >
> >      > HBase Shell
> >      >
> >      > Use "help" to get list of supported commands.
> >      >
> >      > Use "exit" to quit this interactive shell.
> >      >
> >      > Version 2.0.0.3.0.1.0-187,
> >     re9fcf450949102de5069b257a6dee469b8f5aab3,
> >      > Wed Sep 19 10:16:35 UTC 2018
> >      >
> >      > Took 0.0016 seconds
> >      >
> >      > hbase(main):001:0> list
> >      >
> >      > TABLE
> >      >
> >      > ATLAS_ENTITY_AUDIT_EVENTS
> >      >
> >      > MYTABLE
> >      >
> >      > SYSTEM:CATALOG
> >      >
> >      > SYSTEM:FUNCTION
> >      >
> >      > SYSTEM:LOG
> >      >
> >      > SYSTEM:MUTEX
> >      >
> >      > SYSTEM:SEQUENCE
> >      >
> >      > SYSTEM:STATS
> >      >
> >      > atlas_janus
> >      >
> >      > 9 row(s)
> >      >
> >      > Took 0.6114 seconds
> >      >
> >      > => ["ATLAS_ENTITY_AUDIT_EVENTS", "MYTABLE", "SYSTEM:CATALOG",
> >      > "SYSTEM:FUNCTION", "SYSTEM:LOG", "SYSTEM:MUTEX",
> "SYSTEM:SEQUENCE",
> >      > "SYSTEM:STATS", "atlas_janus"]
> >      >
> >      >
> >      >
> >      >
> >      >
> >      > phoenix-sqlline master-1.punch.datareservoir.net
> >      >
> >      > *Setting property: [incremental, false]*
> >      >
> >      > *Setting property: [isolation, TRANSACTION_READ_COMMITTED]*
> >      >
> >      > *issuing: !connect jdbc:phoenix:mysrv none none
> >      > org.apache.phoenix.jdbc.PhoenixDriver*
> >      >
> >      > *Connecting to jdbc:phoenix:mysrv*
> >      >
> >      > SLF4J: Class path contains multiple SLF4J bindings.
> >      >
> >      > SLF4J: Found binding in
> >      >
> >
>  [jar:file:/usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >      >
> >      > SLF4J: Found binding in
> >      >
> >
>  [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >      >
> >      > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for
> an
> >      > explanation.
> >      >
> >      > 18/11/27 15:45:51 WARN util.NativeCodeLoader: Unable to load
> >      > native-hadoop library for your platform... using builtin-java
> >     classes
> >      > where applicable
> >      >
> >      > *Connected to: Phoenix (version 5.0)*
> >      >
> >      > *Driver: PhoenixEmbeddedDriver (version 5.0)*
> >      >
> >      > *Autocommit status: true*
> >      >
> >      > *Transaction isolation: TRANSACTION_READ_COMMITTED*
> >      >
> >      > Building list of tables and columns for tab-completion (set
> >     fastconnect
> >      > to true to skip)...
> >      >
> >      > 144/144 (100%) Done
> >      >
> >      > Done
> >      >
> >      > sqlline version 1.2.0
> >      >
> >      > 0: jdbc:phoenix:mysrv> !tables
> >      >
> >      >
> >      > +------------+--------------+-------------+---------------+----------+------------+----------------------------+-----------------+--------------+-----------------+---------------+---------------+-----------------+------------+---------+
> >      > | TABLE_CAT  | TABLE_SCHEM  | TABLE_NAME  | TABLE_TYPE    | REMARKS  | TYPE_NAME  | SELF_REFERENCING_COL_NAME  | REF_GENERATION  | INDEX_STATE  | IMMUTABLE_ROWS  | SALT_BUCKETS  | MULTI_TENANT  | VIEW_STATEMENT  | VIEW_TYPE  | INDEX_T |
> >      > +------------+--------------+-------------+---------------+----------+------------+----------------------------+-----------------+--------------+-----------------+---------------+---------------+-----------------+------------+---------+
> >      > |            | SYSTEM       | CATALOG     | SYSTEM TABLE  |          |            |                            |                 |              | false           | null          | false         |                 |            |         |
> >      > |            | SYSTEM       | FUNCTION    | SYSTEM TABLE  |          |            |                            |                 |              | false           | null          | false         |                 |            |         |
> >      > |            | SYSTEM       | LOG         | SYSTEM TABLE  |          |            |                            |                 |              | true            | 32            | false         |                 |            |         |
> >      > |            | SYSTEM       | SEQUENCE    | SYSTEM TABLE  |          |            |                            |                 |              | false           | null          | false         |                 |            |         |
> >      > |            | SYSTEM       | STATS       | SYSTEM TABLE  |          |            |                            |                 |              | false           | null          | false         |                 |            |         |
> >      > |            |              | MYTABLE     | TABLE         |          |            |                            |                 |              | false           | 5             | false         |                 |            |         |
> >      > +------------+--------------+-------------+---------------+----------+------------+----------------------------+-----------------+--------------+-----------------+---------------+---------------+-----------------+------------+---------+
> >      >
> >
>

Re: client does not have phoenix.schema.isNamespaceMappingEnabled

Posted by Josh Elser <el...@apache.org>.
Why didn't it work?

The hbase-protocol.jar is insufficient to run MapReduce jobs against 
HBase; full stop. You're going to get lots of stuff pulled in via the 
phoenix-client.jar that you give to `hadoop jar`. That said, I can't 
think of a reason that including more jars on the classpath would be 
harmful.

Realistically, you might only need to provide HBASE_CONF_DIR to the 
HADOOP_CLASSPATH env variable, so that your mappers and reducers also 
get it on their classpath. The rest of the Java classes would be 
automatically localized via `hadoop jar`.
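To make the directory-vs-file rule concrete, here is a small sketch using the paths from this thread (treat it as an illustration, not HDP documentation): JVM classloaders only resolve resources like hbase-site.xml from directories or jars on the classpath, so naming the file itself does nothing.

```shell
# The conf directory that actually contains hbase-site.xml.
CONF_DIR=/etc/hbase/3.0.1.0-187/0

# File entry: silently ignored by the classloader.
WRONG="/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:$CONF_DIR/hbase-site.xml"

# Directory entry: every file in it becomes a loadable resource.
RIGHT="/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:$CONF_DIR"

echo "$RIGHT"
```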

On 11/29/18 1:27 PM, M. Aaron Bossert wrote:
> So, sorry for the super late reply...there is a weird lag between the time
> a message is sent to this mailing list and when I actually
> see it... But I have got it working now as follows:
> 
> HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/ hadoop jar ...
> 
> using this did not work:
> 
> HADOOP_CLASSPATH="$(hbase mapredcp)" hadoop jar ...
> 
> 
> the output of that command separately is this:
> 
> [user@server /somedir]$ hbase mapredcp
> 
> /usr/hdp/3.0.1.0-187/hbase/lib/hbase-shaded-protobuf-2.1.0.jar:/usr/hdp/3.0.1.0-187/zookeeper/zookeeper-3.4.6.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/htrace-core4-4.2.0-incubating.jar:/usr/hdp/3.0.1.0-187/hbase/lib/commons-lang3-3.6.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-server-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol-shaded-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-hadoop2-compat-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-mapreduce-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-metrics-api-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/3.0.1.0-187/hbase/lib/metrics-core-3.2.1.jar:/usr/hdp/3.0.1.0-187/hbase/lib/jackson-databind-2.9.5.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-client-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-hadoop-compat-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-shaded-netty-2.1.0.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-shaded-miscellaneous-2.1.0.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-metrics-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-common-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-zookeeper-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/jackson-annotations-2.9.5.jar:/usr/hdp/3.0.1.0-187/hbase/lib/jackson-core-2.9.5.jar
> 
> 
> On Tue, Nov 27, 2018 at 4:26 PM Josh Elser <elserj@apache.org 
> <ma...@apache.org>> wrote:
> 
>     To add a non-jar file to the classpath of a Java application, you must
>     add the directory containing that file to the classpath.
> 
>     Thus, the following is wrong:
>     HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/hbase-site.xml
> 
>     And should be:
>     HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/
> 
>     Most times, including the output of `hbase mapredcp` is sufficient ala
> 
>     HADOOP_CLASSPATH="$(hbase mapredcp)" hadoop jar ...
> 
>     On 11/27/18 10:48 AM, M. Aaron Bossert wrote:
>      > Folks,
>      >
>      > I have, I believe, followed all the directions for turning on
>     namespace
>      > mapping as well as extra steps to (added classpath) required to
>     use the
>      > mapreduce bulk load utility, but am still running into this
>     error...I am
>      > running a Hortonworks cluster with both HDP v 3.0.1 and HDF
>     components.
>      > Here is what I have tried:
>      >
>      >   * Checked that the proper hbase-site.xml (in my case:
>      >     /etc/hbase/3.0.1.0-187/0/hbase-site.xml) file is being referenced
>      >     when launching the mapreduce utility:
>      >
>      >
>      >      ...
>      >
>      >
>      > <property>
>      >
>      > <name>phoenix.schema.isNamespaceMappingEnabled</name>
>      >
>      > <value>true</value>
>      >
>      > </property>
>      >
>      > <property>
>      >
>      > <name>phoenix.schema.mapSystemTablesToNamespace</name>
>      >
>      > <value>true</value>
>      >
>      > </property>
>      >
>      >
>      >      ...
>      >
>      >   * added the appropriate classpath additions to the hadoop jar
>     command
>      >     (zookeeper quorum hostnames changed to remove my corporate
>     network
>      >     info as well as data directory):
>      >
>      >
>      > HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/hbase-site.xml
>      > hadoop jar
>      > /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar
>      > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input
>      > /ingest/MYCSV -z zk1,zk2,zk3 -g
>      >
>      >
>      > ...
>      >
>      >
>      > 18/11/27 15:31:48 INFO zookeeper.ReadOnlyZKClient: Close zookeeper
>      > connection 0x1d58d65f to master-1.punch.datareservoir.net:2181,master-2.punch.datareservoir.net:2181,master-3.punch.datareservoir.net:2181
>      >
>      > 18/11/27 15:31:48 INFO log.QueryLoggerDisruptor: Shutting down
>      > QueryLoggerDisruptor..
>      >
>      > Exception in thread "main" java.sql.SQLException: ERROR 726
>      > (43M10):Inconsistent namespace mapping properties. Cannot initiate
>      > connection as SYSTEM:CATALOG is found but client does not have
>      > phoenix.schema.isNamespaceMappingEnabled enabled
>      >
>      > at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:494)
>      > at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:150)
>      > at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1113)
>      > at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1501)
>      > at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2740)
>      > at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:1114)
>      > at org.apache.phoenix.compile.CreateTableCompiler$1.execute(CreateTableCompiler.java:192)
>      > at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:408)
>      > at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:391)
>      > at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>      > at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:390)
>      > at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:378)
>      > at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1806)
>      > at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2569)
>      > at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2532)
>      > at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
>      > at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2532)
>      > at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
>      > at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
>      > at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
>      > at java.sql.DriverManager.getConnection(DriverManager.java:664)
>      > at java.sql.DriverManager.getConnection(DriverManager.java:208)
>      > at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:400)
>      > at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:392)
>      > at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:206)
>      > at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:180)
>      > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>      > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>      > at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
>      > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>      > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>      > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>      > at java.lang.reflect.Method.invoke(Method.java:498)
>      > at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
>      > at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
>      >
>      > 18/11/27 15:31:48 INFO zookeeper.ZooKeeper: Session: 0x3672eebffa800c8 closed
>      >
>      > 18/11/27 15:31:48 INFO zookeeper.ClientCnxn: EventThread shut down
>      >
>      >   * Also tried the other recommended option:
>      >
>      > HADOOP_CLASSPATH=$(hbase mapredcp):/etc/hbase/3.0.1.0-187/0/hbase-site.xml hadoop jar
>      > /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar
>      > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input
>      > /ingest/MYCSV -z zk1,zk2,zk3 -g
>      >
>      >
>      > ...
>      >
>      >
>      > 18/11/27 15:31:48 INFO zookeeper.ReadOnlyZKClient: Close zookeeper
>      > connection 0x1d58d65f to master-1.punch.datareservoir.net:2181,master-2.punch.datareservoir.net:2181,master-3.punch.datareservoir.net:2181
>      >
>      > 18/11/27 15:31:48 INFO log.QueryLoggerDisruptor: Shutting down
>      > QueryLoggerDisruptor..
>      >
>      > Exception in thread "main" java.sql.SQLException: ERROR 726 (43M10):
>      > Inconsistent namespace mapping properties. Cannot initiate
>      > connection as SYSTEM:CATALOG is found but client does not have
>      > phoenix.schema.isNamespaceMappingEnabled enabled
>      >
>      >
>      > 18/11/27 15:31:48 INFO zookeeper.ZooKeeper: Session: 0x3672eebffa800c8 closed
>      >
>      > 18/11/27 15:31:48 INFO zookeeper.ClientCnxn: EventThread shut down
>      >
>      >   * As well as the recommended approach in the HBase reference guide
>      >     linked in the Phoenix docs:
>      >
>      > HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar
>      > /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar
>      > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input
>      > /ingest/MYCSV -z zk1,zk2,zk3 -g
>      >
>      >
>      > Exception in thread "main" java.lang.NoClassDefFoundError:
>      > org/apache/commons/cli/DefaultParser
>      > at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.parseOptions(AbstractBulkLoadTool.java:128)
>      > at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:176)
>      > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>      > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>      > at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
>      > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>      > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>      > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>      > at java.lang.reflect.Method.invoke(Method.java:498)
>      > at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
>      > at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
>      >
>      > Caused by: java.lang.ClassNotFoundException:
>      > org.apache.commons.cli.DefaultParser
>      > at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>      > at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>      > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>      > at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>      > ... 11 more
>      >
>      >   * And finally, here is what the tables look like in both the HBase
>      >     shell and sqlline:
>      >
>      > hbase shell
>      >
>      > HBase Shell
>      >
>      > Use "help" to get list of supported commands.
>      >
>      > Use "exit" to quit this interactive shell.
>      >
>      > Version 2.0.0.3.0.1.0-187,
>     re9fcf450949102de5069b257a6dee469b8f5aab3,
>      > Wed Sep 19 10:16:35 UTC 2018
>      >
>      > Took 0.0016 seconds
>      >
>      > hbase(main):001:0> list
>      >
>      > TABLE
>      >
>      > ATLAS_ENTITY_AUDIT_EVENTS
>      >
>      > MYTABLE
>      >
>      > SYSTEM:CATALOG
>      >
>      > SYSTEM:FUNCTION
>      >
>      > SYSTEM:LOG
>      >
>      > SYSTEM:MUTEX
>      >
>      > SYSTEM:SEQUENCE
>      >
>      > SYSTEM:STATS
>      >
>      > atlas_janus
>      >
>      > 9 row(s)
>      >
>      > Took 0.6114 seconds
>      >
>      > => ["ATLAS_ENTITY_AUDIT_EVENTS", "MYTABLE", "SYSTEM:CATALOG",
>      > "SYSTEM:FUNCTION", "SYSTEM:LOG", "SYSTEM:MUTEX", "SYSTEM:SEQUENCE",
>      > "SYSTEM:STATS", "atlas_janus"]
>      >
>      >
>      >
>      >
>      >
>      > phoenix-sqlline master-1.punch.datareservoir.net
>      >
>      > Setting property: [incremental, false]
>      >
>      > Setting property: [isolation, TRANSACTION_READ_COMMITTED]
>      >
>      > issuing: !connect jdbc:phoenix:mysrv none none
>      > org.apache.phoenix.jdbc.PhoenixDriver
>      >
>      > Connecting to jdbc:phoenix:mysrv
>      >
>      > SLF4J: Class path contains multiple SLF4J bindings.
>      >
>      > SLF4J: Found binding in
>      >
>     [jar:file:/usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>      >
>      > SLF4J: Found binding in
>      >
>     [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>      >
>      > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>      > explanation.
>      >
>      > 18/11/27 15:45:51 WARN util.NativeCodeLoader: Unable to load
>      > native-hadoop library for your platform... using builtin-java
>     classes
>      > where applicable
>      >
>      > Connected to: Phoenix (version 5.0)
>      >
>      > Driver: PhoenixEmbeddedDriver (version 5.0)
>      >
>      > Autocommit status: true
>      >
>      > Transaction isolation: TRANSACTION_READ_COMMITTED
>      >
>      > Building list of tables and columns for tab-completion (set
>     fastconnect
>      > to true to skip)...
>      >
>      > 144/144 (100%) Done
>      >
>      > Done
>      >
>      > sqlline version 1.2.0
>      >
>      > 0: jdbc:phoenix:mysrv> !tables
>      >
>      >
>      > +--------------+-------------+---------------+-----------------+---------------+---------------+
>      > | TABLE_SCHEM  | TABLE_NAME  | TABLE_TYPE    | IMMUTABLE_ROWS  | SALT_BUCKETS  | MULTI_TENANT  |
>      > +--------------+-------------+---------------+-----------------+---------------+---------------+
>      > | SYSTEM       | CATALOG     | SYSTEM TABLE  | false           | null          | false         |
>      > | SYSTEM       | FUNCTION    | SYSTEM TABLE  | false           | null          | false         |
>      > | SYSTEM       | LOG         | SYSTEM TABLE  | true            | 32            | false         |
>      > | SYSTEM       | SEQUENCE    | SYSTEM TABLE  | false           | null          | false         |
>      > | SYSTEM       | STATS       | SYSTEM TABLE  | false           | null          | false         |
>      > |              | MYTABLE     | TABLE         | false           | 5             | false         |
>      > +--------------+-------------+---------------+-----------------+---------------+---------------+
>      > (empty columns elided)
>      >
> 

Re: client does not have phoenix.schema.isNamespaceMappingEnabled

Posted by "M. Aaron Bossert" <ma...@gmail.com>.
So, sorry for the super late reply...there is a weird lag between the time a
message is sent to this mailing list and when I actually see it. But I have
got it working now as follows:

HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/ hadoop jar ...

using this did not work:

HADOOP_CLASSPATH="$(hbase mapredcp)" hadoop jar ...


the output of that command separately is this:

[user@server /somedir]$ hbase mapredcp

/usr/hdp/3.0.1.0-187/hbase/lib/hbase-shaded-protobuf-2.1.0.jar:/usr/hdp/3.0.1.0-187/zookeeper/zookeeper-3.4.6.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/htrace-core4-4.2.0-incubating.jar:/usr/hdp/3.0.1.0-187/hbase/lib/commons-lang3-3.6.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-server-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol-shaded-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-hadoop2-compat-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-mapreduce-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-metrics-api-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/protobuf-java-2.5.0.jar:/usr/hdp/3.0.1.0-187/hbase/lib/metrics-core-3.2.1.jar:/usr/hdp/3.0.1.0-187/hbase/lib/jackson-databind-2.9.5.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-client-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-hadoop-compat-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-shaded-netty-2.1.0.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-shaded-miscellaneous-2.1.0.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-metrics-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-common-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/hbase-zookeeper-2.0.0.3.0.1.0-187.jar:/usr/hdp/3.0.1.0-187/hbase/lib/jackson-annotations-2.9.5.jar:/usr/hdp/3.0.1.0-187/hbase/lib/jackson-core-2.9.5.jar
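The working pattern above can be sketched as a small shell wrapper. This is a sketch under assumptions: the HDP paths are the ones from this thread, and `classpath_entry` is a hypothetical helper (not part of any Hadoop or HBase tooling) that catches the mistake Josh points out below, namely putting hbase-site.xml itself on the classpath instead of the directory that contains it:

```shell
# classpath_entry: hypothetical guard that accepts a config *directory*
# and rejects a bare XML file, since the JVM only resolves resources
# from directories and jars on the classpath.
classpath_entry() {
  case "$1" in
    *.xml) echo "error: use the directory containing $1, not the file itself" >&2
           return 1 ;;
    *)     printf '%s' "$1" ;;
  esac
}

conf_dir=/etc/hbase/3.0.1.0-187/0   # directory holding hbase-site.xml
proto_jar=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar

if entry=$(classpath_entry "$conf_dir"); then
  HADOOP_CLASSPATH="${proto_jar}:${entry}"
  export HADOOP_CLASSPATH
  echo "HADOOP_CLASSPATH=$HADOOP_CLASSPATH"
  # Then launch the bulk load exactly as in the thread:
  # hadoop jar /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar \
  #   org.apache.phoenix.mapreduce.CsvBulkLoadTool \
  #   --table MYTABLE --input /ingest/MYCSV -z zk1,zk2,zk3 -g
fi
```

With the directory on the classpath, the client's resource lookup finds hbase-site.xml by name and picks up phoenix.schema.isNamespaceMappingEnabled; a bare `.xml` entry is silently ignored, so the client falls back to defaults and fails with the inconsistent-namespace-mapping error shown above.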

On Tue, Nov 27, 2018 at 4:26 PM Josh Elser <el...@apache.org> wrote:

> To add a non-jar file to the classpath of a Java application, you must
> add the directory containing that file to the classpath.
>
> Thus, the following is wrong:
>
> HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/hbase-site.xml
>
> And should be:
>
> HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/
>
> Most times, including the output of `hbase mapredcp` is sufficient ala
>
> HADOOP_CLASSPATH="$(hbase mapredcp)" hadoop jar ...
>
> On 11/27/18 10:48 AM, M. Aaron Bossert wrote:
> > *false *| **|**| **|*
> >
> > *|**| *SYSTEM *| *SEQUENCE*| *SYSTEM TABLE*|**|**|**| **|**| *false *|
> > *null*| *false *| **|**| **|*
> >
> > *|**| *SYSTEM *| *STATS *| *SYSTEM TABLE*|**|**|**| **|**| *false *|
> > *null*| *false *| **|**| **|*
> >
> > *|**|**| *MYTABLE*| *TABLE *|**|**|**| **|**| *false *| *5 *| *false *|
> > **|**| **|*
> >
> >
> *+------------+--------------+-------------+---------------+----------+------------+----------------------------+-----------------+--------------+-----------------+---------------+---------------+-----------------+------------+---------+*
> >
>

Re: client does not have phoenix.schema.isNamespaceMappingEnabled

Posted by Josh Elser <el...@apache.org>.
To add a non-jar file to the classpath of a Java application, you must 
add the directory containing that file to the classpath.

Thus, the following is wrong: 
HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/hbase-site.xml

And should be: 
HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/

Most of the time, including the output of `hbase mapredcp` is sufficient, e.g.:

HADOOP_CLASSPATH="$(hbase mapredcp)" hadoop jar ...

On 11/27/18 10:48 AM, M. Aaron Bossert wrote:
> Folks,
> 
> I have, I believe, followed all the directions for turning on namespace 
> mapping as well as extra steps to (added classpath) required to use the 
> mapreduce bulk load utility, but am still running into this error...I am 
> running a Hortonworks cluster with both HDP v 3.0.1 and HDF components.  
> Here is what I have tried:
> 
>   * Checked that the proper hbase-site.xml (in my case:
>     /etc/hbase/3.0.1.0-187/0/hbase-site.xml) file is being referenced
>     when launching the mapreduce utility:
> 
> 
>      ...
> 
> 
> <property>
> 
> <name>phoenix.schema.isNamespaceMappingEnabled</name>
> 
> <value>true</value>
> 
> </property>
> 
> <property>
> 
> <name>phoenix.schema.mapSystemTablesToNamespace</name>
> 
> <value>true</value>
> 
> </property>
> 
> 
>      ...
> 
>   * added the appropriate classpath additions to the hadoop jar command
>     (zookeeper quorum hostnames changed to remove my corporate network
>     info as well as data directory):
> 
> HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/hbase-site.xml 
> hadoop jar 
> /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar 
> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input 
> /ingest/MYCSV -z zk1,zk2,zk3 -g
> 
> 
> ...
> 
> 
> 18/11/27 15:31:48 INFO zookeeper.ReadOnlyZKClient: Close zookeeper 
> connection 0x1d58d65f to master-1.punch.datareservoir.net:2181 
> <http://master-1.punch.datareservoir.net:2181>,master-2.punch.datareservoir.net:2181 
> <http://master-2.punch.datareservoir.net:2181>,master-3.punch.datareservoir.net:2181 
> <http://master-3.punch.datareservoir.net:2181>
> 
> 18/11/27 15:31:48 INFO log.QueryLoggerDisruptor: Shutting down 
> QueryLoggerDisruptor..
> 
> Exception in thread "main" java.sql.SQLException: ERROR 726 
> (43M10):Inconsistent namespace mapping properties. Cannot initiate 
> connection as SYSTEM:CATALOG is found but client does not have 
> phoenix.schema.isNamespaceMappingEnabled enabled
> 
> at 
> org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:494)
> 
> at 
> org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:150)
> 
> at 
> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1113)
> 
> at 
> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1501)
> 
> at 
> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2740)
> 
> at 
> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:1114)
> 
> at 
> org.apache.phoenix.compile.CreateTableCompiler$1.execute(CreateTableCompiler.java:192)
> 
> at 
> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:408)
> 
> at 
> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:391)
> 
> at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
> 
> at 
> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:390)
> 
> at 
> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:378)
> 
> at 
> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1806)
> 
> at 
> org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2569)
> 
> at 
> org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2532)
> 
> at 
> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
> 
> at 
> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2532)
> 
> at 
> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
> 
> at 
> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
> 
> at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
> 
> at java.sql.DriverManager.getConnection(DriverManager.java:664)
> 
> at java.sql.DriverManager.getConnection(DriverManager.java:208)
> 
> at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:400)
> 
> at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:392)
> 
> at 
> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:206)
> 
> at 
> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:180)
> 
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
> 
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
> 
> at 
> org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
> 
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 
> at java.lang.reflect.Method.invoke(Method.java:498)
> 
> at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
> 
> at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
> 
> 18/11/27 15:31:48 INFO zookeeper.ZooKeeper: Session: 0x3672eebffa800c8 
> closed
> 
> 18/11/27 15:31:48 INFO zookeeper.ClientCnxn: EventThread shut down
> 
>   * Also tried the other recommended option:
> 
> HADOOP_CLASSPATH=$(hbase 
> mapredcp):/etc/hbase/3.0.1.0-187/0/hbase-site.xml hadoop jar 
> /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar 
> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input 
> /ingest/MYCSV -z zk1,zk2,zk3 -g
> 
> 
> ...
> 
> 
> [output identical to the first attempt above: the same "Inconsistent 
> namespace mapping properties" error and stack trace]
> 
>   * As well as the recommended approach in the HBase reference guide
>     linked in the Phoenix docs:
> 
> HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar 
> /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar 
> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input 
> /ingest/MYCSV -z zk1,zk2,zk3 -g
> 
> 
> Exception in thread "main" java.lang.NoClassDefFoundError: 
> org/apache/commons/cli/DefaultParser
> 
> at 
> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.parseOptions(AbstractBulkLoadTool.java:128)
> 
> at 
> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:176)
> 
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
> 
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
> 
> at 
> org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:109)
> 
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 
> at java.lang.reflect.Method.invoke(Method.java:498)
> 
> at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
> 
> at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
> 
> Caused by: java.lang.ClassNotFoundException: 
> org.apache.commons.cli.DefaultParser
> 
> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> 
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> 
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
> 
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> 
> ... 11 more
> 
>   * And finally, here is what the tables look like in both Hbase shell
>     and sqlline:
> 
> hbase shell
> 
> HBase Shell
> 
> Use "help" to get list of supported commands.
> 
> Use "exit" to quit this interactive shell.
> 
> Version 2.0.0.3.0.1.0-187, re9fcf450949102de5069b257a6dee469b8f5aab3, 
> Wed Sep 19 10:16:35 UTC 2018
> 
> Took 0.0016 seconds
> 
> hbase(main):001:0> list
> 
> TABLE
> 
> ATLAS_ENTITY_AUDIT_EVENTS
> 
> MYTABLE
> 
> SYSTEM:CATALOG
> 
> SYSTEM:FUNCTION
> 
> SYSTEM:LOG
> 
> SYSTEM:MUTEX
> 
> SYSTEM:SEQUENCE
> 
> SYSTEM:STATS
> 
> atlas_janus
> 
> 9 row(s)
> 
> Took 0.6114 seconds
> 
> => ["ATLAS_ENTITY_AUDIT_EVENTS", "MYTABLE", "SYSTEM:CATALOG", 
> "SYSTEM:FUNCTION", "SYSTEM:LOG", "SYSTEM:MUTEX", "SYSTEM:SEQUENCE", 
> "SYSTEM:STATS", "atlas_janus"]
> 
> 
> 
> 
> 
> phoenix-sqlline master-1.punch.datareservoir.net 
> <http://master-1.punch.datareservoir.net>
> 
> *Setting property: [incremental, false]*
> 
> *Setting property: [isolation, TRANSACTION_READ_COMMITTED]*
> 
> *issuing: !connect jdbc:phoenix:mysrv none none 
> org.apache.phoenix.jdbc.PhoenixDriver*
> 
> *Connecting to jdbc:phoenix:mysrv*
> 
> SLF4J: Class path contains multiple SLF4J bindings.
> 
> SLF4J: Found binding in 
> [jar:file:/usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> 
> SLF4J: Found binding in 
> [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> 
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
> explanation.
> 
> 18/11/27 15:45:51 WARN util.NativeCodeLoader: Unable to load 
> native-hadoop library for your platform... using builtin-java classes 
> where applicable
> 
> *Connected to: Phoenix (version 5.0)*
> 
> *Driver: PhoenixEmbeddedDriver (version 5.0)*
> 
> *Autocommit status: true*
> 
> *Transaction isolation: TRANSACTION_READ_COMMITTED*
> 
> Building list of tables and columns for tab-completion (set fastconnect 
> to true to skip)...
> 
> 144/144 (100%) Done
> 
> Done
> 
> sqlline version 1.2.0
> 
> 0: jdbc:phoenix:mysrv> !tables
> 
> +-----------+-------------+------------+--------------+---------+-----------+---------------------------+----------------+-------------+----------------+--------------+--------------+----------------+-----------+---------+
> | TABLE_CAT | TABLE_SCHEM | TABLE_NAME | TABLE_TYPE   | REMARKS | TYPE_NAME | SELF_REFERENCING_COL_NAME | REF_GENERATION | INDEX_STATE | IMMUTABLE_ROWS | SALT_BUCKETS | MULTI_TENANT | VIEW_STATEMENT | VIEW_TYPE | INDEX_T |
> +-----------+-------------+------------+--------------+---------+-----------+---------------------------+----------------+-------------+----------------+--------------+--------------+----------------+-----------+---------+
> |           | SYSTEM      | CATALOG    | SYSTEM TABLE |         |           |                           |                |             | false          | null         | false        |                |           |         |
> |           | SYSTEM      | FUNCTION   | SYSTEM TABLE |         |           |                           |                |             | false          | null         | false        |                |           |         |
> |           | SYSTEM      | LOG        | SYSTEM TABLE |         |           |                           |                |             | true           | 32           | false        |                |           |         |
> |           | SYSTEM      | SEQUENCE   | SYSTEM TABLE |         |           |                           |                |             | false          | null         | false        |                |           |         |
> |           | SYSTEM      | STATS      | SYSTEM TABLE |         |           |                           |                |             | false          | null         | false        |                |           |         |
> |           |             | MYTABLE    | TABLE        |         |           |                           |                |             | false          | 5            | false        |                |           |         |
> +-----------+-------------+------------+--------------+---------+-----------+---------------------------+----------------+-------------+----------------+--------------+--------------+----------------+-----------+---------+
> 

Re: client does not have phoenix.schema.isNamespaceMappingEnabled

Posted by venkata subbarayudu <av...@gmail.com>.
Can you try setting the variable below before you launch the bulk load
utility? (It should point at the configuration directory that contains
hbase-site.xml, not at the file itself.)


export HBASE_CONF_DIR=/etc/hbase/3.0.1.0-187/0
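The same caveat from earlier in the thread applies here: HBASE_CONF_DIR must name the configuration directory, not the hbase-site.xml file inside it. A minimal guard might look like the following sketch (the HDP path is the one from this thread; adjust for your cluster):

```shell
# Succeeds only when the given path is a directory that actually
# contains hbase-site.xml.
check_conf_dir() {
    [ -d "$1" ] && [ -f "$1/hbase-site.xml" ]
}

# Export the directory (not the file) only after verifying it.
if check_conf_dir /etc/hbase/3.0.1.0-187/0; then
    export HBASE_CONF_DIR=/etc/hbase/3.0.1.0-187/0
else
    echo "hbase-site.xml not found in conf dir" >&2
fi
```

Running the bulk load tool with this variable set lets the HBase client pick up the namespace-mapping properties the server already has.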


On Tue 27 Nov, 2018, 9:18 PM M. Aaron Bossert <mabossert@gmail.com wrote:

> Folks,
>
> I have, I believe, followed all the directions for turning on namespace
> mapping as well as extra steps to (added classpath) required to use the
> mapreduce bulk load utility, but am still running into this error...I am
> running a Hortonworks cluster with both HDP v 3.0.1 and HDF components.
> Here is what I have tried:
>
>
>    - Checked that the proper hbase-site.xml (in my case:
>    /etc/hbase/3.0.1.0-187/0/hbase-site.xml) file is being referenced when
>    launching the mapreduce utility:
>
>
>     ...
>
>
>     <property>
>
>       <name>phoenix.schema.isNamespaceMappingEnabled</name>
>
>       <value>true</value>
>
>     </property>
>
>
>
>     <property>
>
>       <name>phoenix.schema.mapSystemTablesToNamespace</name>
>
>       <value>true</value>
>
>     </property>
>
>
>     ...
>
>    - added the appropriate classpath additions to the hadoop jar command
>    (zookeeper quorum hostnames changed to remove my corporate network info as
>    well as data directory):
>
> HADOOP_CLASSPATH=/usr/hdp/3.0.1.0-187/hbase/lib/hbase-protocol.jar:/etc/hbase/3.0.1.0-187/0/hbase-site.xml
> hadoop jar
> /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar
> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input
> /ingest/MYCSV -z zk1,zk2,zk3 -g
>
>
> ...
>
>
> [output identical to the runs quoted above: the same "Inconsistent
> namespace mapping properties" error and stack trace]
>
>    - Also tried the other recommended option:
>
> HADOOP_CLASSPATH=$(hbase mapredcp):/etc/hbase/3.0.1.0-187/0/hbase-site.xml hadoop
> jar /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar
> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input
> /ingest/MYCSV -z zk1,zk2,zk3 -g
>
>
> ...
>
>
> [output identical to the runs quoted above: the same "Inconsistent
> namespace mapping properties" error and stack trace]
>
>    - As well as the recommended approach in the HBase reference guide
>    linked in the Phoenix docs:
>
> HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar
> /usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar
> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MYTABLE --input
> /ingest/MYCSV -z zk1,zk2,zk3 -g
>
>
> [same NoClassDefFoundError: org/apache/commons/cli/DefaultParser stack
> trace as quoted above]
>
>    - And finally, here is what the tables look like in both Hbase shell
>    and sqlline:
>
> hbase shell
>
> [same HBase shell banner and `list` output as quoted above:
> ATLAS_ENTITY_AUDIT_EVENTS, MYTABLE, the SYSTEM:* tables, and
> atlas_janus]
>
>
>
>
>
> phoenix-sqlline master-1.punch.datareservoir.net
>
> *Setting property: [incremental, false]*
>
> *Setting property: [isolation, TRANSACTION_READ_COMMITTED]*
>
> *issuing: !connect jdbc:phoenix:mysrv none none
> org.apache.phoenix.jdbc.PhoenixDriver*
>
> *Connecting to jdbc:phoenix:mysrv*
>
> SLF4J: Class path contains multiple SLF4J bindings.
>
> SLF4J: Found binding in
> [jar:file:/usr/hdp/3.0.1.0-187/phoenix/phoenix-5.0.0.3.0.1.0-187-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>
> SLF4J: Found binding in
> [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
>
> 18/11/27 15:45:51 WARN util.NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
>
> *Connected to: Phoenix (version 5.0)*
>
> *Driver: PhoenixEmbeddedDriver (version 5.0)*
>
> *Autocommit status: true*
>
> *Transaction isolation: TRANSACTION_READ_COMMITTED*
>
> Building list of tables and columns for tab-completion (set fastconnect to
> true to skip)...
>
> 144/144 (100%) Done
>
> Done
>
> sqlline version 1.2.0
>
> 0: jdbc:phoenix:mysrv> !tables
>
>
> *+------------+--------------+-------------+---------------+----------+------------+----------------------------+-----------------+--------------+-----------------+---------------+---------------+-----------------+------------+---------+*
>
> *| **TABLE_CAT ** | **TABLE_SCHEM ** | **TABLE_NAME ** | ** TABLE_TYPE  **
> | **REMARKS ** | **TYPE_NAME ** | **SELF_REFERENCING_COL_NAME ** | *
> *REF_GENERATION ** | **INDEX_STATE ** | **IMMUTABLE_ROWS ** | *
> *SALT_BUCKETS ** | **MULTI_TENANT ** | **VIEW_STATEMENT ** | **VIEW_TYPE **
> | **INDEX_T** |*
>
>
> *+------------+--------------+-------------+---------------+----------+------------+----------------------------+-----------------+--------------+-----------------+---------------+---------------+-----------------+------------+---------+*
>
> *| *          * | *SYSTEM      * | *CATALOG    * | *SYSTEM TABLE * | *
>      * | *          * | *                          * | *               *
> | *            * | *false          * | *null         * | *false        *
> | *               * | *          * | *       * |*
>
> *| *          * | *SYSTEM      * | *FUNCTION   * | *SYSTEM TABLE * | *
>      * | *          * | *                          * | *               *
> | *            * | *false          * | *null         * | *false        *
> | *               * | *          * | *       * |*
>
> *| *          * | *SYSTEM      * | *LOG        * | *SYSTEM TABLE * | *
>      * | *          * | *                          * | *               *
> | *            * | *true           * | *32           * | *false        *
> | *               * | *          * | *       * |*
>
> *| *          * | *SYSTEM      * | *SEQUENCE   * | *SYSTEM TABLE * | *
>      * | *          * | *                          * | *               *
> | *            * | *false          * | *null         * | *false        *
> | *               * | *          * | *       * |*
>
> *| *          * | *SYSTEM      * | *STATS      * | *SYSTEM TABLE * | *
>      * | *          * | *                          * | *               *
> | *            * | *false          * | *null         * | *false        *
> | *               * | *          * | *       * |*
>
> *| *          * | *            * | *MYTABLE   * | *TABLE        * | *
>    * | *          * | *                          <span class="gmail-s2"
> style="font-variant-
>