Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2016/06/05 01:04:59 UTC

[jira] [Closed] (SPARK-11100) HiveThriftServer HA issue: HiveThriftServer not registering with ZooKeeper

     [ https://issues.apache.org/jira/browse/SPARK-11100?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Reynold Xin closed SPARK-11100.
-------------------------------
    Resolution: Won't Fix

Thanks for the patch.

The thrift server has gone through a lot of changes lately, and we are not sure whether we'd want to keep the existing thrift server at all, because it carries a huge amount of technical debt. Moreover, I'm not sure the ZooKeeper approach chosen by Hive is that great: it is pretty difficult to operate in practice, so it may not become the long-term solution. For now, I'd recommend patching your own version of Spark to get this functionality.
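For context, the feature in question is Hive's dynamic service discovery: on startup, HiveServer2 creates an ephemeral znode under the configured namespace, and clients resolve a live server through ZooKeeper instead of a fixed host:port. A minimal sketch of how one would verify and use it, assuming ZooKeeper's default client port 2181 and the namespace from the report below:

```shell
# Check whether the server registered its znode
# (assumes zkCli.sh is on the PATH and ZooKeeper listens on the default port 2181)
zkCli.sh -server zk1:2181 ls /sparkhiveserver2

# Connect through service discovery rather than a fixed host:port;
# this is the standard Hive JDBC URL form for dynamic discovery
beeline -u "jdbc:hive2://zk1,zk2,zk3/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=sparkhiveserver2"
```

Both commands require a running ZooKeeper ensemble; per the report below, the znode is never created when the Spark thrift server starts, so the second command would fail against it.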


> HiveThriftServer HA issue: HiveThriftServer not registering with ZooKeeper
> --------------------------------------------------------------------------
>
>                 Key: SPARK-11100
>                 URL: https://issues.apache.org/jira/browse/SPARK-11100
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.1
>         Environment: Hive-1.2.1
> Hadoop-2.6.0
>            Reporter: Xiaoyu Wang
>
> hive-site.xml config:
> {code}
> <property>
>   <name>hive.server2.support.dynamic.service.discovery</name>
>   <value>true</value>
> </property>
> <property>
>   <name>hive.server2.zookeeper.namespace</name>
>   <value>sparkhiveserver2</value>
> </property>
> <property>
>   <name>hive.zookeeper.quorum</name>
>   <value>zk1,zk2,zk3</value>
> </property>
> {code}
> then start thrift server
> {code}
> start-thriftserver.sh --master yarn
> {code}
> The znode "sparkhiveserver2" is not found in ZooKeeper.
> Plain HiveServer2 registers correctly with this same config!



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org