Posted to issues@spark.apache.org by "Bolke de Bruin (JIRA)" <ji...@apache.org> on 2015/06/22 09:19:00 UTC

[jira] [Comment Edited] (SPARK-5111) HiveContext and Thriftserver cannot work in secure cluster beyond hadoop2.5

    [ https://issues.apache.org/jira/browse/SPARK-5111?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14595451#comment-14595451 ] 

Bolke de Bruin edited comment on SPARK-5111 at 6/22/15 7:18 AM:
----------------------------------------------------------------

Further analysis shows that the latest Spark (1.5.0-SNAPSHOT) from git won't connect to a secure cluster without the patch. This is because Hive 0.13 is still used internally, even though 0.14 can be specified by configuration for some actions (0.14 is then loaded in an isolated classloader). However, the secure connection is required earlier in the process and therefore still fails.

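For reference, here is a minimal sketch of how the isolated Hive client can be pointed at 0.14, using the spark.sql.hive.metastore.version and spark.sql.hive.metastore.jars settings that exist as of Spark 1.4 (the "0.14.0" value is an assumption for 1.5.0-SNAPSHOT). As described above, this does not fix the secure case, because the SASL handshake runs against the built-in Hive 0.13 classes before the isolated classloader is ever consulted:

    // Sketch: selecting Hive 0.14 for the isolated metastore client.
    // Only metastore calls go through the isolated classloader; the
    // built-in Hive 0.13 classes still handle the earlier secure setup.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    val conf = new SparkConf()
      .setAppName("hive-0.14-isolated")
      .set("spark.sql.hive.metastore.version", "0.14.0")
      // "maven" resolves the jars at startup; a classpath of local
      // Hive 0.14 jars works as well
      .set("spark.sql.hive.metastore.jars", "maven")

    val sc = new SparkContext(conf)
    // On a secure (Kerberos) cluster this still fails with
    // java.lang.NoSuchFieldError: SASL_PROPS from the built-in Hive 0.13
    val hiveContext = new HiveContext(sc)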

> HiveContext and Thriftserver cannot work in secure cluster beyond hadoop2.5
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-5111
>                 URL: https://issues.apache.org/jira/browse/SPARK-5111
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Zhan Zhang
>
> Fails with a "java.lang.NoSuchFieldError: SASL_PROPS" error. Some Hive 0.14 fixes need to be backported into Spark, since there is no ongoing effort to upgrade Spark's Hive support to 0.14.
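For context: NoSuchFieldError is a link-time failure, i.e. code compiled against an older class throws at run time once the field has been removed. The sketch below probes for the field named in the error message; the class name org.apache.hadoop.security.SaslRpcServer is the usual attribution for SASL_PROPS in Hive 0.13's shims, but treat it as an assumption here:

    // Hypothetical probe: check whether the Hadoop version on the
    // classpath still exposes the static field Hive 0.13 was compiled
    // against. When it is gone (Hadoop 2.5+), Hive 0.13's shims fail at
    // run time with java.lang.NoSuchFieldError: SASL_PROPS.
    import scala.util.{Failure, Success, Try}

    object SaslPropsProbe extends App {
      Try(Class.forName("org.apache.hadoop.security.SaslRpcServer")
            .getField("SASL_PROPS")) match {
        case Success(f) =>
          println(s"Field present ($f): older Hadoop, Hive 0.13 shims work")
        case Failure(_: NoSuchFieldException) =>
          println("SASL_PROPS is gone: Hive 0.13 shims will throw " +
                  "NoSuchFieldError when opening a secure connection")
        case Failure(e) =>
          println(s"Hadoop classes not on the classpath: $e")
      }
    }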


