Posted to user@hbase.apache.org by Mich Talebzadeh <mi...@gmail.com> on 2018/08/20 17:35:32 UTC

Phoenix CsvBulkLoadTool fails with java.sql.SQLException: ERROR 103 (08004): Unable to establish connection

This was working fine before my HBase upgrade to 1.2.6.

I have HBase version 1.2.6 and Phoenix
version apache-phoenix-4.8.1-HBase-1.2-bin

This command, bulk loading into HBase through Phoenix, now fails

HADOOP_CLASSPATH=${HOME}/jars/hbase-protocol-1.2.6.jar:${HBASE_HOME}/conf
hadoop jar ${HBASE_HOME}/lib/phoenix-4.8.1-HBase-1.2-client.jar
org.apache.phoenix.mapreduce.CsvBulkLoadTool --table ${TABLE_NAME} --input
hdfs://rhes75:9000/${REFINED_HBASE_SUB_DIR}/${FILE_NAME}_${dir}.txt

hadoop jar /data6/hduser/hbase-1.2.6/lib/phoenix-4.8.1-HBase-1.2-client.jar
org.apache.phoenix.mapreduce.CsvBulkLoadTool --table MARKETDATAHBASEBATCH
--input
hdfs://rhes75:9000//data/prices/2018-08-20_refined/populate_Phoenix_table_MARKETDATAHBASEBATCH_2018-08-20.txt
+
HADOOP_CLASSPATH=/home/hduser/jars/hbase-protocol-1.2.6.jar:/data6/hduser/hbase-1.2.6/conf
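One sanity check worth running here (a sketch, not part of the original commands): a jar is just a zip archive, so you can inspect whether the Phoenix fat jar bundles Hadoop 2 HDFS client classes. HftpFileSystem is the giveaway, since it exists in Hadoop 2 but was removed in Hadoop 3. The demo below builds a synthetic jar because the real phoenix-4.8.1-HBase-1.2-client.jar path is cluster-specific.

```python
import zipfile

def bundles_hadoop2_hdfs(jar_path: str) -> bool:
    """Return True if the jar bundles HftpFileSystem, an HDFS client
    class that exists in Hadoop 2 but was removed in Hadoop 3."""
    with zipfile.ZipFile(jar_path) as jar:
        return any(name.endswith("hdfs/web/HftpFileSystem.class")
                   for name in jar.namelist())

# Demo on a synthetic jar; against a real install you would point this
# at phoenix-4.8.1-HBase-1.2-client.jar instead.
with zipfile.ZipFile("demo-client.jar", "w") as jar:
    jar.writestr("org/apache/hadoop/hdfs/web/HftpFileSystem.class", b"")

print(bundles_hadoop2_hdfs("demo-client.jar"))  # True
```

If the check returns True while `hadoop version` reports 3.x, the client jar and the cluster disagree on the Hadoop line.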


With the following error

2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:java.library.path=/home/hduser/hadoop-3.1.0/lib
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:java.io.tmpdir=/tmp
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:java.compiler=<NA>
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:os.name=Linux
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:os.arch=amd64
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:os.version=3.10.0-862.3.2.el7.x86_64
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:user.name=hduser
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:user.home=/home/hduser
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:user.dir=/data6/hduser/streaming_data/2018-08-20
2018-08-20 18:29:47,249 INFO  [main] zookeeper.ZooKeeper: Initiating client
connection, connectString=rhes75:2181 sessionTimeout=90000
watcher=hconnection-0x493d44230x0, quorum=rhes75:2181, baseZNode=/hbase
2018-08-20 18:29:47,261 INFO  [main-SendThread(rhes75:2181)]
zookeeper.ClientCnxn: Opening socket connection to server rhes75/
50.140.197.220:2181. Will not attempt to authenticate using SASL (unknown
error)
2018-08-20 18:29:47,264 INFO  [main-SendThread(rhes75:2181)]
zookeeper.ClientCnxn: Socket connection established to rhes75/
50.140.197.220:2181, initiating session
2018-08-20 18:29:47,281 INFO  [main-SendThread(rhes75:2181)]
zookeeper.ClientCnxn:
Session establishment complete on server rhes75/50.140.197.220:2181,
sessionid = 0x1002ea99eed0077, negotiated timeout = 40000
Exception in thread "main" java.sql.SQLException: ERROR 103 (08004): Unable
to establish connection.
        at
org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:455)

Any thoughts?

Thanks

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.

Re: Phoenix CsvBulkLoadTool fails with java.sql.SQLException: ERROR 103 (08004): Unable to establish connection

Posted by Josh Elser <el...@apache.org>.
Btw, this is covered in the HBase book:

http://hbase.apache.org/book.html#hadoop

The reality is that HBase 2.x will work with Hadoop 3. The "unsupported" 
tag mostly expresses that the combination is not yet considered ready 
for production.

On 8/21/18 2:23 AM, Jaanai Zhang wrote:
> Caused by: java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
> 
> This is the root cause: it seems that HBase 1.2 can't access this 
> interface of Hadoop 3.1, so you should consider downgrading Hadoop or 
> upgrading HBase.
> 
> 
> ----------------------------------------
>     Yun Zhang
>     Best regards!
> 
> 
> 2018-08-21 11:28 GMT+08:00 Mich Talebzadeh <mich.talebzadeh@gmail.com>:
> 
>     Hi,
> 
>     The Hadoop version is Hadoop 3.1.0. Hbase is 1.2.6 and Phoenix is
>     apache-phoenix-4.8.1-HBase-1.2-bin
> 
>     In the past I had issues with Hbase 2 working with Hadoop 3.1 so I
>     had to use Hbase 1.2.6. The individual components work fine. In
>     other words I can do all operations on Hbase with Hadoop 3.1 and
>     Phoenix.
> 
>     The issue I am facing is using both
>     org.apache.phoenix.mapreduce.CsvBulkLoadTool and
>     hbase.mapreduce.ImportTsv utilities.
> 
>     So I presume the issue may be that both of these command-line
>     tools do not work with Hadoop 3.1?
> 
>     Thanks
> 
>     Dr Mich Talebzadeh
> 
> 
> 
> 
>     On Tue, 21 Aug 2018 at 00:48, Sergey Soldatov <sergey.soldatov@gmail.com> wrote:
> 
>         If I read it correctly, you are trying to use Phoenix and HBase
>         that were built against Hadoop 2 with Hadoop 3. Was HBase the
>         only component you upgraded?
> 
>         Thanks,
>         Sergey
> 
>         On Mon, Aug 20, 2018 at 1:42 PM Mich Talebzadeh <mich.talebzadeh@gmail.com> wrote:
> 
>             Here you go
> 
>             2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client environment:java.library.path=/home/hduser/hadoop-3.1.0/lib
>             2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
>             2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
>             2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client environment:os.name=Linux
>             2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client environment:os.arch=amd64
>             2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client environment:os.version=3.10.0-862.3.2.el7.x86_64
>             2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client environment:user.name=hduser
>             2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client environment:user.home=/home/hduser
>             2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client environment:user.dir=/data6/hduser/streaming_data/2018-08-20
>             2018-08-20 18:29:47,249 INFO  [main] zookeeper.ZooKeeper: Initiating client connection, connectString=rhes75:2181 sessionTimeout=90000 watcher=hconnection-0x493d44230x0, quorum=rhes75:2181, baseZNode=/hbase
>             2018-08-20 18:29:47,261 INFO  [main-SendThread(rhes75:2181)] zookeeper.ClientCnxn: Opening socket connection to server rhes75/50.140.197.220:2181. Will not attempt to authenticate using SASL (unknown error)
>             2018-08-20 18:29:47,264 INFO  [main-SendThread(rhes75:2181)] zookeeper.ClientCnxn: Socket connection established to rhes75/50.140.197.220:2181, initiating session
>             2018-08-20 18:29:47,281 INFO  [main-SendThread(rhes75:2181)] zookeeper.ClientCnxn: Session establishment complete on server rhes75/50.140.197.220:2181, sessionid = 0x1002ea99eed0077, negotiated timeout = 40000
>             Exception in thread "main" java.sql.SQLException: ERROR 103
>             (08004): Unable to establish connection.
>                      at
>             org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:455)
>                      at
>             org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
>                      at
>             org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:386)
>                      at
>             org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:222)
>                      at
>             org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2318)
>                      at
>             org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2294)
>                      at
>             org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
>                      at
>             org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2294)
>                      at
>             org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:232)
>                      at
>             org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:147)
>                      at
>             org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
>                      at
>             java.sql.DriverManager.getConnection(DriverManager.java:664)
>                      at
>             java.sql.DriverManager.getConnection(DriverManager.java:208)
>                      at
>             org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:340)
>                      at
>             org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:332)
>                      at
>             org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:209)
>                      at
>             org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:183)
>                      at
>             org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>                      at
>             org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>                      at
>             org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:101)
>                      at
>             sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>                      at
>             sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>                      at
>             sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>                      at java.lang.reflect.Method.invoke(Method.java:498)
>                      at org.apache.hadoop.util.RunJar.run(RunJar.java:308)
>                      at org.apache.hadoop.util.RunJar.main(RunJar.java:222)
>             Caused by: java.io.IOException:
>             java.lang.reflect.InvocationTargetException
>                      at
>             org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
>                      at
>             org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:431)
>                      at
>             org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:340)
>                      at
>             org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:144)
>                      at
>             org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
>                      at
>             org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:383)
>                      ... 23 more
>             Caused by: java.lang.reflect.InvocationTargetException
>                      at
>             sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>                      at
>             sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>                      at
>             sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>                      at
>             java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>                      at
>             org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
>                      ... 28 more
>             Caused by: java.lang.IllegalAccessError: class
>             org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its
>             superinterface
>             org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
>                      at java.lang.ClassLoader.defineClass1(Native Method)
>                      at
>             java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>                      at
>             java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>                      at
>             java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>                      at
>             java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>                      at
>             java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>                      at
>             java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>                      at
>             java.security.AccessController.doPrivileged(Native Method)
>                      at
>             java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>                      at
>             java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>                      at
>             java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>                      at java.lang.Class.forName0(Native Method)
>                      at java.lang.Class.forName(Class.java:348)
>                      at
>             java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
>                      at
>             java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
>                      at
>             java.util.ServiceLoader$1.next(ServiceLoader.java:480)
>                      at
>             org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3268)
>                      at
>             org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3313)
>                      at
>             org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3352)
>                      at
>             org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
>                      at
>             org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3403)
>                      at
>             org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3371)
>                      at
>             org.apache.hadoop.fs.FileSystem.get(FileSystem.java:477)
>                      at
>             org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
>                      at
>             org.apache.hadoop.hbase.util.DynamicClassLoader.initTempDir(DynamicClassLoader.java:120)
>                      at
>             org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:98)
>                      at
>             org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:242)
>                      at
>             org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
>                      at
>             org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:75)
>                      at
>             org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
>                      at
>             org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:905)
>                      at
>             org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:648)
>                      ... 33 more
> 
>             Thanks
> 
>             Dr Mich Talebzadeh
> 
> 
> 
> 
>             On Mon, 20 Aug 2018 at 21:24, Josh Elser <elserj@apache.org> wrote:
> 
>                 (-cc user@hbase, +bcc user@hbase)
> 
>                 How about the rest of the stacktrace? You didn't share
>                 the cause.
> 

Re: Phoenix CsvBulkLoadTool fails with java.sql.SQLException: ERROR 103 (08004): Unable to establish connection

Posted by Jaanai Zhang <cl...@gmail.com>.
Caused by: java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator

This is the root cause: it seems that HBase 1.2 can't access this
interface of Hadoop 3.1, so you should consider downgrading Hadoop or
upgrading HBase.
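This diagnosis comes from reading the bottom of the chained stack trace: in a Java trace, the last "Caused by:" line names the root cause. A minimal sketch of that reading, using the exception chain from this thread:

```python
def root_cause(stack_trace: str) -> str:
    """Return the innermost 'Caused by:' line of a Java stack trace,
    or the first line if there is no cause chain."""
    causes = [line for line in stack_trace.splitlines()
              if line.startswith("Caused by:")]
    return causes[-1] if causes else stack_trace.splitlines()[0]

# The chain from this thread, with frame lines omitted for brevity.
trace = """\
Exception in thread "main" java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
Caused by: java.lang.reflect.InvocationTargetException
Caused by: java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator"""

print(root_cause(trace))  # prints the IllegalAccessError line
```

The top-level SQLException ("Unable to establish connection") is generic; only the innermost cause points at the Hadoop 2/3 class mismatch.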


----------------------------------------
   Yun Zhang
   Best regards!


>>> ServiceLoader.java:404)
>>>         at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
>>>         at org.apache.hadoop.fs.FileSystem.loadFileSystems(
>>> FileSystem.java:3268)
>>>         at org.apache.hadoop.fs.FileSystem.getFileSystemClass(
>>> FileSystem.java:3313)
>>>         at org.apache.hadoop.fs.FileSystem.createFileSystem(
>>> FileSystem.java:3352)
>>>         at org.apache.hadoop.fs.FileSystem.access$200(
>>> FileSystem.java:124)
>>>         at org.apache.hadoop.fs.FileSystem$Cache.getInternal(
>>> FileSystem.java:3403)
>>>         at org.apache.hadoop.fs.FileSystem$Cache.get(
>>> FileSystem.java:3371)
>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:477)
>>>         at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
>>>         at org.apache.hadoop.hbase.util.DynamicClassLoader.initTempDir(
>>> DynamicClassLoader.java:120)
>>>         at org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(
>>> DynamicClassLoader.java:98)
>>>         at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>
>>> (ProtobufUtil.java:242)
>>>         at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.
>>> java:64)
>>>         at org.apache.hadoop.hbase.zookeeper.ZKClusterId.
>>> readClusterIdZNode(ZKClusterId.java:75)
>>>         at org.apache.hadoop.hbase.client.ZooKeeperRegistry.
>>> getClusterId(ZooKeeperRegistry.java:105)
>>>         at org.apache.hadoop.hbase.client.ConnectionManager$
>>> HConnectionImplementation.retrieveClusterId(ConnectionManager.java:905)
>>>         at org.apache.hadoop.hbase.client.ConnectionManager$
>>> HConnectionImplementation.<init>(ConnectionManager.java:648)
>>>         ... 33 more
>>>
>>> Thanks
>>>
>>> Dr Mich Talebzadeh
>>>
>>>
>>>
>>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>>
>>>
>>>
>>> http://talebzadehmich.wordpress.com
>>>
>>>
>>> *Disclaimer:* Use it at your own risk. Any and all responsibility for
>>> any loss, damage or destruction of data or any other property which may
>>> arise from relying on this email's technical content is explicitly
>>> disclaimed. The author will in no case be liable for any monetary damages
>>> arising from such loss, damage or destruction.
>>>
>>>
>>>
>>>
>>> On Mon, 20 Aug 2018 at 21:24, Josh Elser <el...@apache.org> wrote:
>>>
>>>> (-cc user@hbase, +bcc user@hbase)
>>>>
>>>> How about the rest of the stacktrace? You didn't share the cause.
>>>>
>>>> On 8/20/18 1:35 PM, Mich Talebzadeh wrote:
>>>> >
>>>> > This was working fine before my Hbase upgrade to 1.2.6
>>>> >
>>>> > I have Hbase version 1.2.6 and Phoenix
>>>> > version apache-phoenix-4.8.1-HBase-1.2-bin
>>>> >
>>>> > This command, bulk loading into HBase through Phoenix, now fails
>>>> >
>>>> > HADOOP_CLASSPATH=${HOME}/jars/hbase-protocol-1.2.6.jar:${HBASE_HOME}/conf
>>>> > hadoop jar ${HBASE_HOME}/lib/phoenix-4.8.1-HBase-1.2-client.jar
>>>> > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table ${TABLE_NAME}
>>>> > --input hdfs://rhes75:9000/${REFINED_HBASE_SUB_DIR}/${FILE_NAME}_${dir}.txt
>>>> >
>>>> > hadoop jar
>>>> > /data6/hduser/hbase-1.2.6/lib/phoenix-4.8.1-HBase-1.2-client.jar
>>>> > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table
>>>> > MARKETDATAHBASEBATCH --input
>>>> > hdfs://rhes75:9000//data/prices/2018-08-20_refined/populate_Phoenix_table_MARKETDATAHBASEBATCH_2018-08-20.txt
>>>> > +
>>>> > HADOOP_CLASSPATH=/home/hduser/jars/hbase-protocol-1.2.6.jar:/data6/hduser/hbase-1.2.6/conf
>>>> >
>>>> >
>>>> > With the following error
>>>> >
>>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>>> > environment:java.library.path=/home/hduser/hadoop-3.1.0/lib
>>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>>> > environment:java.io.tmpdir=/tmp
>>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>>> > environment:java.compiler=<NA>
>>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>>> > environment:os.name=Linux
>>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>>> > environment:os.arch=amd64
>>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>>> > environment:os.version=3.10.0-862.3.2.el7.x86_64
>>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>>> > environment:user.name=hduser
>>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>>> > environment:user.home=/home/hduser
>>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>>> > environment:user.dir=/data6/hduser/streaming_data/2018-08-20
>>>> > 2018-08-20 18:29:47,249 INFO  [main] zookeeper.ZooKeeper: Initiating
>>>> > client connection, connectString=rhes75:2181 sessionTimeout=90000
>>>> > watcher=hconnection-0x493d44230x0, quorum=rhes75:2181, baseZNode=/hbase
>>>> > 2018-08-20 18:29:47,261 INFO  [main-SendThread(rhes75:2181)]
>>>> > zookeeper.ClientCnxn: Opening socket connection to server
>>>> > rhes75/50.140.197.220:2181. Will not
>>>> > attempt to authenticate using SASL (unknown error)
>>>> > 2018-08-20 18:29:47,264 INFO  [main-SendThread(rhes75:2181)]
>>>> > zookeeper.ClientCnxn: Socket connection established to
>>>> > rhes75/50.140.197.220:2181, initiating session
>>>> > 2018-08-20 18:29:47,281 INFO  [main-SendThread(rhes75:2181)]
>>>> > zookeeper.ClientCnxn: Session establishment complete on server
>>>> > rhes75/50.140.197.220:2181, sessionid =
>>>> > 0x1002ea99eed0077, negotiated timeout = 40000
>>>> > Exception in thread "main" java.sql.SQLException: ERROR 103 (08004):
>>>> > Unable to establish connection.
>>>> >          at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:455)
>>>> >
>>>> > Any thoughts?
>>>> >
>>>> > Thanks
>>>> >
>>>> > Dr Mich Talebzadeh
>>>> >
>>>> > LinkedIn
>>>> > https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>>> >
>>>> > http://talebzadehmich.wordpress.com
>>>> >
>>>> >
>>>> > *Disclaimer:* Use it at your own risk. Any and all responsibility for any
>>>> > loss, damage or destruction of data or any other property which may
>>>> > arise from relying on this email's technical content is explicitly
>>>> > disclaimed. The author will in no case be liable for any monetary
>>>> > damages arising from such loss, damage or destruction.
>>>> >
>>>>
>>>

Re: Phoenix CsvBulkLoadTool fails with java.sql.SQLException: ERROR 103 (08004): Unable to establish connection

Posted by Mich Talebzadeh <mi...@gmail.com>.
Hi,

The Hadoop version is 3.1.0, HBase is 1.2.6 and Phoenix is
apache-phoenix-4.8.1-HBase-1.2-bin.

In the past I had issues getting HBase 2 to work with Hadoop 3.1, so I had
to use HBase 1.2.6. The individual components work fine: I can do all
operations on HBase with Hadoop 3.1 and Phoenix.

The issue I am facing is with both the
org.apache.phoenix.mapreduce.CsvBulkLoadTool and hbase.mapreduce.ImportTsv
utilities.

So I presume the issue may be that neither of these command-line tools
works with Hadoop 3.1?
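The deepest cause in the stack trace, the IllegalAccessError while loading
org.apache.hadoop.hdfs.web.HftpFileSystem, usually means a Hadoop 2 copy of
that class is being loaded against Hadoop 3 libraries. Since a jar is just a
zip archive, one way to check whether a client jar bundles that class is a
short script (a sketch; the function name and jar path below are
illustrative, not from this thread):

```python
# Sketch: scan a jar (a jar is a zip archive) for bundled class files
# matching a substring, e.g. the Hadoop 2 HftpFileSystem class that
# triggers the IllegalAccessError above.
import zipfile

def find_bundled_classes(jar_path, needle):
    """Return the .class entries in jar_path whose path contains needle."""
    with zipfile.ZipFile(jar_path) as jar:
        return [name for name in jar.namelist()
                if name.endswith(".class") and needle in name]

# Illustrative usage against the shaded client jar from the command above:
# find_bundled_classes("phoenix-4.8.1-HBase-1.2-client.jar",
#                      "hdfs/web/HftpFileSystem")
```

If that returns a hit, the shaded Phoenix client jar carries its own Hadoop 2
HDFS classes, which would clash with the Hadoop 3.1 runtime.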

Thanks

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Tue, 21 Aug 2018 at 00:48, Sergey Soldatov <se...@gmail.com>
wrote:

> If I read it correctly you are trying to use Phoenix and HBase that were
> built against Hadoop 2 with Hadoop 3. Was HBase the only component you
> have upgraded?
>
> Thanks,
> Sergey
>
> On Mon, Aug 20, 2018 at 1:42 PM Mich Talebzadeh <mi...@gmail.com>
> wrote:
>
>> Here you go
>>
>> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> environment:java.library.path=/home/hduser/hadoop-3.1.0/lib
>> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> environment:java.io.tmpdir=/tmp
>> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> environment:java.compiler=<NA>
>> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> environment:os.name=Linux
>> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> environment:os.arch=amd64
>> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> environment:os.version=3.10.0-862.3.2.el7.x86_64
>> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> environment:user.name=hduser
>> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> environment:user.home=/home/hduser
>> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> environment:user.dir=/data6/hduser/streaming_data/2018-08-20
>> 2018-08-20 18:29:47,249 INFO  [main] zookeeper.ZooKeeper: Initiating
>> client connection, connectString=rhes75:2181 sessionTimeout=90000
>> watcher=hconnection-0x493d44230x0, quorum=rhes75:2181, baseZNode=/hbase
>> 2018-08-20 18:29:47,261 INFO  [main-SendThread(rhes75:2181)]
>> zookeeper.ClientCnxn: Opening socket connection to server rhes75/
>> 50.140.197.220:2181. Will not attempt to authenticate using SASL
>> (unknown error)
>> 2018-08-20 18:29:47,264 INFO  [main-SendThread(rhes75:2181)]
>> zookeeper.ClientCnxn: Socket connection established to rhes75/
>> 50.140.197.220:2181, initiating session
>> 2018-08-20 18:29:47,281 INFO  [main-SendThread(rhes75:2181)]
>> zookeeper.ClientCnxn: Session establishment complete on server rhes75/
>> 50.140.197.220:2181, sessionid = 0x1002ea99eed0077, negotiated timeout =
>> 40000
>> Exception in thread "main" java.sql.SQLException: ERROR 103 (08004):
>> Unable to establish connection.
>>         at
>> org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:455)
>>         at
>> org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
>>         at
>> org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:386)
>>         at
>> org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:222)
>>         at
>> org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2318)
>>         at
>> org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2294)
>>         at
>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
>>         at
>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2294)
>>         at
>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:232)
>>         at
>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:147)
>>         at
>> org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
>>         at java.sql.DriverManager.getConnection(DriverManager.java:664)
>>         at java.sql.DriverManager.getConnection(DriverManager.java:208)
>>         at
>> org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:340)
>>         at
>> org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:332)
>>         at
>> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:209)
>>         at
>> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:183)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>>         at
>> org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:101)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:498)
>>         at org.apache.hadoop.util.RunJar.run(RunJar.java:308)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:222)
>> Caused by: java.io.IOException:
>> java.lang.reflect.InvocationTargetException
>>         at
>> org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
>>         at
>> org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:431)
>>         at
>> org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:340)
>>         at
>> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:144)
>>         at
>> org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
>>         at
>> org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:383)
>>         ... 23 more
>> Caused by: java.lang.reflect.InvocationTargetException
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)
>>         at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>         at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>         at
>> org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
>>         ... 28 more
>> Caused by: java.lang.IllegalAccessError: class
>> org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface
>> org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
>>         at java.lang.ClassLoader.defineClass1(Native Method)
>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>>         at
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>>         at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>         at java.lang.Class.forName0(Native Method)
>>         at java.lang.Class.forName(Class.java:348)
>>         at
>> java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
>>         at
>> java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
>>         at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
>>         at
>> org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3268)
>>         at
>> org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3313)
>>         at
>> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3352)
>>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
>>         at
>> org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3403)
>>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3371)
>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:477)
>>         at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
>>         at
>> org.apache.hadoop.hbase.util.DynamicClassLoader.initTempDir(DynamicClassLoader.java:120)
>>         at
>> org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:98)
>>         at
>> org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:242)
>>         at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
>>         at
>> org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:75)
>>         at
>> org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
>>         at
>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:905)
>>         at
>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:648)
>>         ... 33 more
>>
>> Thanks
>>
>> Dr Mich Talebzadeh
>>
>>
>>
>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>>
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>> *Disclaimer:* Use it at your own risk. Any and all responsibility for
>> any loss, damage or destruction of data or any other property which may
>> arise from relying on this email's technical content is explicitly
>> disclaimed. The author will in no case be liable for any monetary damages
>> arising from such loss, damage or destruction.
>>
>>
>>
>>
>> On Mon, 20 Aug 2018 at 21:24, Josh Elser <el...@apache.org> wrote:
>>
>>> (-cc user@hbase, +bcc user@hbase)
>>>
>>> How about the rest of the stacktrace? You didn't share the cause.
>>>
>>> On 8/20/18 1:35 PM, Mich Talebzadeh wrote:
>>> >
>>> > This was working fine before my Hbase upgrade to 1.2.6
>>> >
>>> > I have Hbase version 1.2.6 and Phoenix
>>> > version apache-phoenix-4.8.1-HBase-1.2-bin
>>> >
>>> > This command, bulk loading into HBase through Phoenix, now fails
>>> >
>>> > HADOOP_CLASSPATH=${HOME}/jars/hbase-protocol-1.2.6.jar:${HBASE_HOME}/conf
>>> > hadoop jar ${HBASE_HOME}/lib/phoenix-4.8.1-HBase-1.2-client.jar
>>> > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table ${TABLE_NAME}
>>> > --input hdfs://rhes75:9000/${REFINED_HBASE_SUB_DIR}/${FILE_NAME}_${dir}.txt
>>> >
>>> > hadoop jar
>>> > /data6/hduser/hbase-1.2.6/lib/phoenix-4.8.1-HBase-1.2-client.jar
>>> > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table
>>> > MARKETDATAHBASEBATCH --input
>>> > hdfs://rhes75:9000//data/prices/2018-08-20_refined/populate_Phoenix_table_MARKETDATAHBASEBATCH_2018-08-20.txt
>>> > +
>>> > HADOOP_CLASSPATH=/home/hduser/jars/hbase-protocol-1.2.6.jar:/data6/hduser/hbase-1.2.6/conf
>>> >
>>> >
>>> > With the following error
>>> >
>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>> > environment:java.library.path=/home/hduser/hadoop-3.1.0/lib
>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>> > environment:java.io.tmpdir=/tmp
>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>> > environment:java.compiler=<NA>
>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>> > environment:os.name=Linux
>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>> > environment:os.arch=amd64
>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>> > environment:os.version=3.10.0-862.3.2.el7.x86_64
>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>> > environment:user.name=hduser
>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>> > environment:user.home=/home/hduser
>>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>>> > environment:user.dir=/data6/hduser/streaming_data/2018-08-20
>>> > 2018-08-20 18:29:47,249 INFO  [main] zookeeper.ZooKeeper: Initiating
>>> > client connection, connectString=rhes75:2181 sessionTimeout=90000
>>> > watcher=hconnection-0x493d44230x0, quorum=rhes75:2181, baseZNode=/hbase
>>> > 2018-08-20 18:29:47,261 INFO  [main-SendThread(rhes75:2181)]
>>> > zookeeper.ClientCnxn: Opening socket connection to server
>>> > rhes75/50.140.197.220:2181. Will not
>>> > attempt to authenticate using SASL (unknown error)
>>> > 2018-08-20 18:29:47,264 INFO  [main-SendThread(rhes75:2181)]
>>> > zookeeper.ClientCnxn: Socket connection established to
>>> > rhes75/50.140.197.220:2181, initiating session
>>> > 2018-08-20 18:29:47,281 INFO  [main-SendThread(rhes75:2181)]
>>> > zookeeper.ClientCnxn: Session establishment complete on server
>>> > rhes75/50.140.197.220:2181, sessionid =
>>> > 0x1002ea99eed0077, negotiated timeout = 40000
>>> > Exception in thread "main" java.sql.SQLException: ERROR 103 (08004):
>>> > Unable to establish connection.
>>> >          at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:455)
>>> >
>>> > Any thoughts?
>>> >
>>> > Thanks
>>> >
>>> > Dr Mich Talebzadeh
>>> >
>>> > LinkedIn
>>> > https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>> >
>>> > http://talebzadehmich.wordpress.com
>>> >
>>> >
>>> > *Disclaimer:* Use it at your own risk. Any and all responsibility for any
>>> > loss, damage or destruction of data or any other property which may
>>> > arise from relying on this email's technical content is explicitly
>>> > disclaimed. The author will in no case be liable for any monetary
>>> > damages arising from such loss, damage or destruction.
>>> >
>>>
>>

Re: Phoenix CsvBulkLoadTool fails with java.sql.SQLException: ERROR 103 (08004): Unable to establish connection

Posted by Sergey Soldatov <se...@gmail.com>.
If I read it correctly you are trying to use Phoenix and HBase that were
built against Hadoop 2 with Hadoop 3. Was HBase the only component you
have upgraded?
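That kind of mixed-build clash can often be confirmed mechanically: when the
same class file appears in more than one jar on the classpath, the JVM
typically loads whichever copy comes first, and a Hadoop 2 copy loaded
against Hadoop 3 interfaces yields exactly the IllegalAccessError seen
above. A small sketch of such a duplicate check (the helper name is
illustrative):

```python
# Sketch: report classes that occur in more than one jar on a classpath.
# With mixed Hadoop 2 / Hadoop 3 jars, the JVM loads whichever copy of a
# class comes first, which can surface as IllegalAccessError at runtime.
import zipfile

def duplicate_classes(jar_paths):
    """Map each duplicated .class entry to the jars that contain it."""
    owners = {}
    for path in jar_paths:
        with zipfile.ZipFile(path) as jar:
            for name in jar.namelist():
                if name.endswith(".class"):
                    owners.setdefault(name, []).append(path)
    return {cls: jars for cls, jars in owners.items() if len(jars) > 1}
```

Running it over the jars named in HADOOP_CLASSPATH plus the Phoenix client
jar would show whether Hadoop classes are present twice.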

Thanks,
Sergey

On Mon, Aug 20, 2018 at 1:42 PM Mich Talebzadeh <mi...@gmail.com>
wrote:

> Here you go
>
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> environment:java.library.path=/home/hduser/hadoop-3.1.0/lib
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> environment:java.io.tmpdir=/tmp
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> environment:java.compiler=<NA>
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> environment:os.name=Linux
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> environment:os.arch=amd64
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> environment:os.version=3.10.0-862.3.2.el7.x86_64
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> environment:user.name=hduser
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> environment:user.home=/home/hduser
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> environment:user.dir=/data6/hduser/streaming_data/2018-08-20
> 2018-08-20 18:29:47,249 INFO  [main] zookeeper.ZooKeeper: Initiating
> client connection, connectString=rhes75:2181 sessionTimeout=90000
> watcher=hconnection-0x493d44230x0, quorum=rhes75:2181, baseZNode=/hbase
> 2018-08-20 18:29:47,261 INFO  [main-SendThread(rhes75:2181)]
> zookeeper.ClientCnxn: Opening socket connection to server rhes75/
> 50.140.197.220:2181. Will not attempt to authenticate using SASL (unknown
> error)
> 2018-08-20 18:29:47,264 INFO  [main-SendThread(rhes75:2181)]
> zookeeper.ClientCnxn: Socket connection established to rhes75/
> 50.140.197.220:2181, initiating session
> 2018-08-20 18:29:47,281 INFO  [main-SendThread(rhes75:2181)]
> zookeeper.ClientCnxn: Session establishment complete on server rhes75/
> 50.140.197.220:2181, sessionid = 0x1002ea99eed0077, negotiated timeout =
> 40000
> Exception in thread "main" java.sql.SQLException: ERROR 103 (08004):
> Unable to establish connection.
>         at
> org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:455)
>         at
> org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
>         at
> org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:386)
>         at
> org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:222)
>         at
> org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2318)
>         at
> org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2294)
>         at
> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
>         at
> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2294)
>         at
> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:232)
>         at
> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:147)
>         at
> org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
>         at java.sql.DriverManager.getConnection(DriverManager.java:664)
>         at java.sql.DriverManager.getConnection(DriverManager.java:208)
>         at
> org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:340)
>         at
> org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:332)
>         at
> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:209)
>         at
> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:183)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>         at
> org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:101)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.hadoop.util.RunJar.run(RunJar.java:308)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:222)
> Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
>         at
> org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
>         at
> org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:431)
>         at
> org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:340)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:144)
>         at
> org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
>         at
> org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:383)
>         ... 23 more
> Caused by: java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
>         at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at
> org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
>         ... 28 more
> Caused by: java.lang.IllegalAccessError: class
> org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface
> org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
>         at java.lang.ClassLoader.defineClass1(Native Method)
>         at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>         at
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>         at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:348)
>         at
> java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
>         at
> java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
>         at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
>         at
> org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3268)
>         at
> org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3313)
>         at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3352)
>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
>         at
> org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3403)
>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3371)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:477)
>         at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
>         at
> org.apache.hadoop.hbase.util.DynamicClassLoader.initTempDir(DynamicClassLoader.java:120)
>         at
> org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:98)
>         at
> org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:242)
>         at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
>         at
> org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:75)
>         at
> org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
>         at
> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:905)
>         at
> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:648)
>         ... 33 more
>
> Thanks
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
>
> On Mon, 20 Aug 2018 at 21:24, Josh Elser <el...@apache.org> wrote:
>
>> (-cc user@hbase, +bcc user@hbase)
>>
>> How about the rest of the stacktrace? You didn't share the cause.
>>
>> On 8/20/18 1:35 PM, Mich Talebzadeh wrote:
>> >
>> > This was working fine before my Hbase upgrade to 1.2.6
>> >
>> > I have Hbase version 1.2.6 and Phoenix
>> > version apache-phoenix-4.8.1-HBase-1.2-bin
>> >
>> > This command, bulk loading into HBase through Phoenix, now fails:
>> >
>> >
>> HADOOP_CLASSPATH=${HOME}/jars/hbase-protocol-1.2.6.jar:${HBASE_HOME}/conf
>> hadoop
>> > jar ${HBASE_HOME}/lib/phoenix-4.8.1-HBase-1.2-client.jar
>> > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table ${TABLE_NAME}
>> > --input
>> hdfs://rhes75:9000/${REFINED_HBASE_SUB_DIR}/${FILE_NAME}_${dir}.txt
>> >
>> > hadoop jar
>> > /data6/hduser/hbase-1.2.6/lib/phoenix-4.8.1-HBase-1.2-client.jar
>> > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table
>> > MARKETDATAHBASEBATCH --input
>> >
>> hdfs://rhes75:9000//data/prices/2018-08-20_refined/populate_Phoenix_table_MARKETDATAHBASEBATCH_2018-08-20.txt
>> > +
>> >
>> HADOOP_CLASSPATH=/home/hduser/jars/hbase-protocol-1.2.6.jar:/data6/hduser/hbase-1.2.6/conf
>> >
>> >
>> > With the following error
>> >
>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> > environment:java.library.path=/home/hduser/hadoop-3.1.0/lib
>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> > environment:java.io.tmpdir=/tmp
>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> > environment:java.compiler=<NA>
>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> > environment:os.name=Linux
>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> > environment:os.arch=amd64
>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> > environment:os.version=3.10.0-862.3.2.el7.x86_64
>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> > environment:user.name=hduser
>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> > environment:user.home=/home/hduser
>> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
>> > environment:user.dir=/data6/hduser/streaming_data/2018-08-20
>> > 2018-08-20 18:29:47,249 INFO  [main] zookeeper.ZooKeeper: Initiating
>> > client connection, connectString=rhes75:2181 sessionTimeout=90000
>> > watcher=hconnection-0x493d44230x0, quorum=rhes75:2181, baseZNode=/hbase
>> > 2018-08-20 18:29:47,261 INFO  [main-SendThread(rhes75:2181)]
>> > zookeeper.ClientCnxn: Opening socket connection to server
>> > rhes75/50.140.197.220:2181. Will not
>> > attempt to authenticate using SASL (unknown error)
>> > 2018-08-20 18:29:47,264 INFO  [main-SendThread(rhes75:2181)]
>> > zookeeper.ClientCnxn: Socket connection established to
>> > rhes75/50.140.197.220:2181, initiating session
>> > 2018-08-20 18:29:47,281 INFO  [main-SendThread(rhes75:2181)]
>> > zookeeper.ClientCnxn: Session establishment complete on server
>> > rhes75/50.140.197.220:2181, sessionid =
>> > 0x1002ea99eed0077, negotiated timeout = 40000
>> > Exception in thread "main" java.sql.SQLException: ERROR 103 (08004):
>> > Unable to establish connection.
>> >          at
>> >
>> org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:455)
>> >
>> > Any thoughts?
>> >
>> > Thanks
>> >
>> > Dr Mich Talebzadeh
>> >
>> > LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>> >
>> > http://talebzadehmich.wordpress.com
>> >
>> >
>> > *Disclaimer:* Use it at your own risk. Any and all responsibility for
>> any
>> > loss, damage or destruction of data or any other property which may
>> > arise from relying on this email's technical content is explicitly
>> > disclaimed. The author will in no case be liable for any monetary
>> > damages arising from such loss, damage or destruction.
>> >
>>
>

Re: Phoenix CsvBulkLoadTool fails with java.sql.SQLException: ERROR 103 (08004): Unable to establish connection

Posted by Mich Talebzadeh <mi...@gmail.com>.
Here you go

2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:java.library.path=/home/hduser/hadoop-3.1.0/lib
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:java.io.tmpdir=/tmp
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:java.compiler=<NA>
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:os.name=Linux
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:os.arch=amd64
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:os.version=3.10.0-862.3.2.el7.x86_64
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:user.name=hduser
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:user.home=/home/hduser
2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
environment:user.dir=/data6/hduser/streaming_data/2018-08-20
2018-08-20 18:29:47,249 INFO  [main] zookeeper.ZooKeeper: Initiating client
connection, connectString=rhes75:2181 sessionTimeout=90000
watcher=hconnection-0x493d44230x0, quorum=rhes75:2181, baseZNode=/hbase
2018-08-20 18:29:47,261 INFO  [main-SendThread(rhes75:2181)]
zookeeper.ClientCnxn: Opening socket connection to server rhes75/
50.140.197.220:2181. Will not attempt to authenticate using SASL (unknown
error)
2018-08-20 18:29:47,264 INFO  [main-SendThread(rhes75:2181)]
zookeeper.ClientCnxn: Socket connection established to rhes75/
50.140.197.220:2181, initiating session
2018-08-20 18:29:47,281 INFO  [main-SendThread(rhes75:2181)]
zookeeper.ClientCnxn: Session establishment complete on server rhes75/
50.140.197.220:2181, sessionid = 0x1002ea99eed0077, negotiated timeout =
40000
Exception in thread "main" java.sql.SQLException: ERROR 103 (08004): Unable
to establish connection.
        at
org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:455)
        at
org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
        at
org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:386)
        at
org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:222)
        at
org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2318)
        at
org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2294)
        at
org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
        at
org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2294)
        at
org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:232)
        at
org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:147)
        at
org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:208)
        at
org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:340)
        at
org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:332)
        at
org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:209)
        at
org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:183)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
        at
org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:101)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:308)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:222)
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
        at
org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
        at
org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:431)
        at
org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:340)
        at
org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:144)
        at
org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
        at
org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:383)
        ... 23 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at
org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
        ... 28 more
Caused by: java.lang.IllegalAccessError: class
org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface
org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
        at
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at
java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
        at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
        at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
        at
org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3268)
        at
org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3313)
        at
org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3352)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
        at
org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3403)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3371)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:477)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
        at
org.apache.hadoop.hbase.util.DynamicClassLoader.initTempDir(DynamicClassLoader.java:120)
        at
org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:98)
        at
org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:242)
        at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
        at
org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:75)
        at
org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
        at
org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:905)
        at
org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:648)
        ... 33 more
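[Editor's note] The root cause in the trace above — an IllegalAccessError raised while the FileSystem ServiceLoader defines org.apache.hadoop.hdfs.web.HftpFileSystem — usually points to mixed Hadoop versions on the classpath: HftpFileSystem ships with Hadoop 2, while the launcher here is hadoop-3.1.0, and the fat phoenix-4.8.1-HBase-1.2-client.jar bundles its own Hadoop classes. One way to see which jars bundle the conflicting class is to scan them; the helper below is a hypothetical diagnostic sketch, not part of Hadoop or Phoenix:

```python
# Hypothetical diagnostic: report which jars on a classpath bundle a given
# class entry, to spot two Hadoop versions being loaded at once.
import zipfile


def jars_containing(class_entry, jar_paths):
    """Return the jar files whose archive entries include class_entry."""
    hits = []
    for path in jar_paths:
        try:
            with zipfile.ZipFile(path) as jar:
                if class_entry in jar.namelist():
                    hits.append(path)
        except (OSError, zipfile.BadZipFile):
            continue  # unreadable path or not a jar/zip: skip it
    return hits


# Example call (paths are illustrative, adjust to your installation):
# jars_containing("org/apache/hadoop/hdfs/web/HftpFileSystem.class",
#                 glob.glob("/data6/hduser/hbase-1.2.6/lib/*.jar"))
```

If the class turns up in a Hadoop 2 era jar while the runtime is Hadoop 3, aligning the Phoenix client build with the Hadoop version in use (or keeping the older jars off HADOOP_CLASSPATH) is the usual direction to investigate.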

Thanks

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Mon, 20 Aug 2018 at 21:24, Josh Elser <el...@apache.org> wrote:

> (-cc user@hbase, +bcc user@hbase)
>
> How about the rest of the stacktrace? You didn't share the cause.
>
> On 8/20/18 1:35 PM, Mich Talebzadeh wrote:
> >
> > This was working fine before my Hbase upgrade to 1.2.6
> >
> > I have Hbase version 1.2.6 and Phoenix
> > version apache-phoenix-4.8.1-HBase-1.2-bin
> >
> > This command, bulk loading into HBase through Phoenix, now fails:
> >
> >
> HADOOP_CLASSPATH=${HOME}/jars/hbase-protocol-1.2.6.jar:${HBASE_HOME}/conf
> hadoop
> > jar ${HBASE_HOME}/lib/phoenix-4.8.1-HBase-1.2-client.jar
> > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table ${TABLE_NAME}
> > --input
> hdfs://rhes75:9000/${REFINED_HBASE_SUB_DIR}/${FILE_NAME}_${dir}.txt
> >
> > hadoop jar
> > /data6/hduser/hbase-1.2.6/lib/phoenix-4.8.1-HBase-1.2-client.jar
> > org.apache.phoenix.mapreduce.CsvBulkLoadTool --table
> > MARKETDATAHBASEBATCH --input
> >
> hdfs://rhes75:9000//data/prices/2018-08-20_refined/populate_Phoenix_table_MARKETDATAHBASEBATCH_2018-08-20.txt
> > +
> >
> HADOOP_CLASSPATH=/home/hduser/jars/hbase-protocol-1.2.6.jar:/data6/hduser/hbase-1.2.6/conf
> >
> >
> > With the following error
> >
> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> > environment:java.library.path=/home/hduser/hadoop-3.1.0/lib
> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> > environment:java.io.tmpdir=/tmp
> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> > environment:java.compiler=<NA>
> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> > environment:os.name=Linux
> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> > environment:os.arch=amd64
> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> > environment:os.version=3.10.0-862.3.2.el7.x86_64
> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> > environment:user.name=hduser
> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> > environment:user.home=/home/hduser
> > 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client
> > environment:user.dir=/data6/hduser/streaming_data/2018-08-20
> > 2018-08-20 18:29:47,249 INFO  [main] zookeeper.ZooKeeper: Initiating
> > client connection, connectString=rhes75:2181 sessionTimeout=90000
> > watcher=hconnection-0x493d44230x0, quorum=rhes75:2181, baseZNode=/hbase
> > 2018-08-20 18:29:47,261 INFO  [main-SendThread(rhes75:2181)]
> > zookeeper.ClientCnxn: Opening socket connection to server
> > rhes75/50.140.197.220:2181. Will not
> > attempt to authenticate using SASL (unknown error)
> > 2018-08-20 18:29:47,264 INFO  [main-SendThread(rhes75:2181)]
> > zookeeper.ClientCnxn: Socket connection established to
> > rhes75/50.140.197.220:2181, initiating session
> > 2018-08-20 18:29:47,281 INFO  [main-SendThread(rhes75:2181)]
> > zookeeper.ClientCnxn: Session establishment complete on server
> > rhes75/50.140.197.220:2181, sessionid =
> > 0x1002ea99eed0077, negotiated timeout = 40000
> > Exception in thread "main" java.sql.SQLException: ERROR 103 (08004):
> > Unable to establish connection.
> >          at
> >
> org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:455)
> >
> > Any thoughts?
> >
> > Thanks
> >
> > Dr Mich Talebzadeh
> >
> > LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
> >
> > http://talebzadehmich.wordpress.com
> >
> >
> > *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> > loss, damage or destruction of data or any other property which may
> > arise from relying on this email's technical content is explicitly
> > disclaimed. The author will in no case be liable for any monetary
> > damages arising from such loss, damage or destruction.
> >
>

Re: Phoenix CsvBulkLoadTool fails with java.sql.SQLException: ERROR 103 (08004): Unable to establish connection

Posted by Josh Elser <el...@apache.org>.
(-cc user@hbase, +bcc user@hbase)

How about the rest of the stacktrace? You didn't share the cause.

On 8/20/18 1:35 PM, Mich Talebzadeh wrote:
> 
> This was working fine before my Hbase upgrade to 1.2.6
> 
> I have Hbase version 1.2.6 and Phoenix 
> version apache-phoenix-4.8.1-HBase-1.2-bin
> 
> This command, bulk loading into HBase through Phoenix, now fails:
> 
> HADOOP_CLASSPATH=${HOME}/jars/hbase-protocol-1.2.6.jar:${HBASE_HOME}/conf hadoop 
> jar ${HBASE_HOME}/lib/phoenix-4.8.1-HBase-1.2-client.jar 
> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table ${TABLE_NAME} 
> --input hdfs://rhes75:9000/${REFINED_HBASE_SUB_DIR}/${FILE_NAME}_${dir}.txt
> 
> hadoop jar 
> /data6/hduser/hbase-1.2.6/lib/phoenix-4.8.1-HBase-1.2-client.jar 
> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table 
> MARKETDATAHBASEBATCH --input 
> hdfs://rhes75:9000//data/prices/2018-08-20_refined/populate_Phoenix_table_MARKETDATAHBASEBATCH_2018-08-20.txt
> + 
> HADOOP_CLASSPATH=/home/hduser/jars/hbase-protocol-1.2.6.jar:/data6/hduser/hbase-1.2.6/conf
> 
> 
> With the following error
> 
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client 
> environment:java.library.path=/home/hduser/hadoop-3.1.0/lib
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client 
> environment:java.io.tmpdir=/tmp
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client 
> environment:java.compiler=<NA>
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client 
> environment:os.name=Linux
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client 
> environment:os.arch=amd64
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client 
> environment:os.version=3.10.0-862.3.2.el7.x86_64
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client 
> environment:user.name=hduser
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client 
> environment:user.home=/home/hduser
> 2018-08-20 18:29:47,248 INFO  [main] zookeeper.ZooKeeper: Client 
> environment:user.dir=/data6/hduser/streaming_data/2018-08-20
> 2018-08-20 18:29:47,249 INFO  [main] zookeeper.ZooKeeper: Initiating 
> client connection, connectString=rhes75:2181 sessionTimeout=90000 
> watcher=hconnection-0x493d44230x0, quorum=rhes75:2181, baseZNode=/hbase
> 2018-08-20 18:29:47,261 INFO  [main-SendThread(rhes75:2181)] 
> zookeeper.ClientCnxn: Opening socket connection to server 
> rhes75/50.140.197.220:2181. Will not
> attempt to authenticate using SASL (unknown error)
> 2018-08-20 18:29:47,264 INFO  [main-SendThread(rhes75:2181)] 
> zookeeper.ClientCnxn: Socket connection established to 
> rhes75/50.140.197.220:2181, initiating session
> 2018-08-20 18:29:47,281 INFO  [main-SendThread(rhes75:2181)] 
> zookeeper.ClientCnxn: Session establishment complete on server 
> rhes75/50.140.197.220:2181, sessionid =
> 0x1002ea99eed0077, negotiated timeout = 40000
> Exception in thread "main" java.sql.SQLException: ERROR 103 (08004): 
> Unable to establish connection.
>          at 
> org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:455)
> 
> Any thoughts?
> 
> Thanks
> 
> Dr Mich Talebzadeh
> 
> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
> 
> http://talebzadehmich.wordpress.com
> 
> 
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may 
> arise from relying on this email's technical content is explicitly 
> disclaimed. The author will in no case be liable for any monetary 
> damages arising from such loss, damage or destruction.
> 
