Posted to issues@spark.apache.org by "Christian Kadner (JIRA)" <ji...@apache.org> on 2015/07/15 00:09:07 UTC
[jira] [Updated] (SPARK-7357) Improving HBaseTest example
[ https://issues.apache.org/jira/browse/SPARK-7357?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Christian Kadner updated SPARK-7357:
------------------------------------
Labels: spark.tc (was: )
> Improving HBaseTest example
> ---------------------------
>
> Key: SPARK-7357
> URL: https://issues.apache.org/jira/browse/SPARK-7357
> Project: Spark
> Issue Type: Improvement
> Components: Examples
> Affects Versions: 1.3.1
> Reporter: Jihong MA
> Assignee: Jihong MA
> Priority: Minor
> Labels: spark.tc
> Fix For: 1.5.0
>
> Original Estimate: 2m
> Remaining Estimate: 2m
>
> Minor improvement to the HBaseTest example: when HBase-related configuration values, e.g. the ZooKeeper quorum, the ZooKeeper client port, or zookeeper.znode.parent, are not set to their defaults (localhost:2181), the connection to ZooKeeper may hang, as shown in the following stack:
> 15/03/26 18:31:20 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=xxx.xxx.xxx:2181 sessionTimeout=90000 watcher=hconnection-0x322a4437, quorum=xxx.xxx.xxx:2181, baseZNode=/hbase
> 15/03/26 18:31:21 INFO zookeeper.ClientCnxn: Opening socket connection to server 9.30.94.121:2181. Will not attempt to authenticate using SASL (unknown error)
> 15/03/26 18:31:21 INFO zookeeper.ClientCnxn: Socket connection established to xxx.xxx.xxx/9.30.94.121:2181, initiating session
> 15/03/26 18:31:21 INFO zookeeper.ClientCnxn: Session establishment complete on server xxx.xxx.xxx/9.30.94.121:2181, sessionid = 0x14c53cd311e004b, negotiated timeout = 40000
> 15/03/26 18:31:21 INFO client.ZooKeeperRegistry: ClusterId read in ZooKeeper is null
> This happens because hbase-site.xml is not placed on the Spark classpath.
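
One way to avoid the hang described above is to set the ZooKeeper properties explicitly on the HBaseConfiguration instead of relying on hbase-site.xml being on the classpath. This is a minimal sketch, not the actual SPARK-7357 patch; the host names below are placeholders:

```scala
import org.apache.hadoop.hbase.HBaseConfiguration

// Sketch only: if hbase-site.xml is not on the Spark classpath,
// HBaseConfiguration.create() falls back to defaults (localhost:2181)
// and the ZooKeeper connection can hang instead of failing fast.
// Setting the properties explicitly avoids the silent fallback.
// The quorum hosts here are placeholder values.
val conf = HBaseConfiguration.create()
conf.set("hbase.zookeeper.quorum", "zk1.example.com,zk2.example.com")
conf.set("hbase.zookeeper.property.clientPort", "2181")
conf.set("zookeeper.znode.parent", "/hbase")
```

Alternatively, the cluster's hbase-site.xml can be put on the driver classpath at submit time, e.g. with spark-submit's --driver-class-path option pointing at the HBase conf directory (an assumption about the deployment, not part of the original report).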
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org