Posted to issues@spark.apache.org by "Yu Gao (JIRA)" <ji...@apache.org> on 2015/04/14 01:28:12 UTC

[jira] [Commented] (SPARK-5111) HiveContext and Thriftserver cannot work in secure cluster beyond hadoop2.5

    [ https://issues.apache.org/jira/browse/SPARK-5111?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14493270#comment-14493270 ] 

Yu Gao commented on SPARK-5111:
-------------------------------

Hi Zhan, which Spark version is going to include this fix? We ran into the same issue with Hadoop 2.6 + Kerberos, so we would like to see this fixed in Spark. Thanks.

> HiveContext and Thriftserver cannot work in secure cluster beyond hadoop2.5
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-5111
>                 URL: https://issues.apache.org/jira/browse/SPARK-5111
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Zhan Zhang
>
> Fails with a "java.lang.NoSuchFieldError: SASL_PROPS" error. Some of the Hive 0.14 fixes need to be backported into Spark, since there is no effort to upgrade Spark's built-in Hive support to 0.14.
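
For context on the error itself: a NoSuchFieldError like this typically appears when code compiled against an older Hadoop still reads the static SaslRpcServer.SASL_PROPS field, which later Hadoop releases removed in favor of SaslPropertiesResolver. The Java sketch below is only an illustration of that API difference under this assumption, not the actual Spark or Hive patch; the class name SaslPropsExample is made up for this example.

    import java.util.Map;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.SaslPropertiesResolver;

    public class SaslPropsExample {
        public static void main(String[] args) {
            Configuration conf = new Configuration();

            // Hive 0.13-era thrift auth code obtained the SASL properties
            // from a static field, roughly:
            //
            //     Map<String, String> props = SaslRpcServer.SASL_PROPS;
            //
            // Newer Hadoop releases dropped that field, so any class
            // compiled against the old API fails at runtime with
            // java.lang.NoSuchFieldError: SASL_PROPS.

            // Hadoop 2.4+ exposes the equivalent settings through
            // SaslPropertiesResolver instead (this is the API the newer
            // Hive shims are assumed to use):
            Map<String, String> saslProps =
                SaslPropertiesResolver.getInstance(conf).getDefaultProperties();

            System.out.println("SASL properties: " + saslProps);
        }
    }

Running this against a current hadoop-common on the classpath prints the resolved SASL QOP settings; the old field access, by contrast, cannot even link against those jars.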



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org