Posted to issues@spark.apache.org by "Yuming Wang (JIRA)" <ji...@apache.org> on 2019/04/24 09:29:00 UTC

[jira] [Updated] (SPARK-27354) Move incompatible code from the hive-thriftserver module to sql/hive-thriftserver/v1.2.1

     [ https://issues.apache.org/jira/browse/SPARK-27354?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang updated SPARK-27354:
--------------------------------
    Summary: Move incompatible code from the hive-thriftserver module to sql/hive-thriftserver/v1.2.1  (was: Add a new empty hive-thriftserver module for Hive 2.3.4 )

> Move incompatible code from the hive-thriftserver module to sql/hive-thriftserver/v1.2.1
> ----------------------------------------------------------------------------------------
>
>                 Key: SPARK-27354
>                 URL: https://issues.apache.org/jira/browse/SPARK-27354
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> When we upgraded the built-in Hive to 2.3.4, the current {{hive-thriftserver}} module became incompatible with it because of Hive changes such as:
>  # HIVE-12442 HiveServer2: Refactor/repackage HiveServer2's Thrift code so that it can be used in the tasks
>  # HIVE-12237 Use slf4j as logging facade
>  # HIVE-13169 HiveServer2: Support delegation token based connection when using http transport
> So we should add a new {{hive-thriftserver}} module for Hive 2.3.4:
> 1. Add a new empty module for Hive 2.3.4 named {{hive-thriftserverV2}}.
> 2. Activate {{hive-thriftserver}} only when testing with hadoop-2.7.
> 3. Activate {{hive-thriftserverV2}} only when testing with hadoop-3.2.
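> Profile-gated module activation like this is normally expressed in the parent {{pom.xml}}. A minimal sketch of the idea follows; the module and profile names here mirror this proposal and Spark's existing hadoop profiles, but the exact layout in Spark's build may differ:
> {code:xml}
> <!-- Parent pom.xml sketch: build each thriftserver module
>      only under the matching Hadoop profile -->
> <profiles>
>   <profile>
>     <id>hadoop-2.7</id>
>     <modules>
>       <module>sql/hive-thriftserver</module>
>     </modules>
>   </profile>
>   <profile>
>     <id>hadoop-3.2</id>
>     <modules>
>       <module>sql/hive-thriftserverV2</module>
>     </modules>
>   </profile>
> </profiles>
> {code}
> With this layout, {{./build/mvn -Phadoop-3.2 ...}} would pick up only the Hive 2.3.4 variant of the module.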
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org