Posted to issues@spark.apache.org by "Diego Fustes Villadóniga (JIRA)" <ji...@apache.org> on 2017/05/04 06:32:04 UTC

[jira] [Commented] (SPARK-5159) Thrift server does not respect hive.server2.enable.doAs=true

    [ https://issues.apache.org/jira/browse/SPARK-5159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15996249#comment-15996249 ] 

Diego Fustes Villadóniga commented on SPARK-5159:
-------------------------------------------------

We are also experiencing problems with version 2.0.0 when enabling impersonation on a Kerberized cluster. In our case, impersonation appears to work for SELECT queries, but when we run a CREATE TABLE ... AS SELECT or an INSERT we receive an exception. The reason is that it is the service user, rather than the impersonated user, who writes to the Hive staging area.

Impersonation is crucial for securing access to cluster resources, so please look into this as soon as possible.
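For reference, this is roughly the configuration we are testing with; a minimal sketch, assuming a standard hive-site.xml and core-site.xml layout (the proxyuser entries assume the Thrift server runs as the "hive" service user, adjust to your deployment):

```xml
<!-- hive-site.xml: ask HiveServer2 / the Spark Thrift server to run
     queries as the submitting user instead of the service user -->
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>

<!-- core-site.xml: allow the service user ("hive" here, an assumption)
     to impersonate other users; required for doAs to take effect -->
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
```

With this in place, SELECT queries are read as the impersonated user, but the staging-area writes described above still happen as the service user.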



> Thrift server does not respect hive.server2.enable.doAs=true
> ------------------------------------------------------------
>
>                 Key: SPARK-5159
>                 URL: https://issues.apache.org/jira/browse/SPARK-5159
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0
>            Reporter: Andrew Ray
>         Attachments: spark_thrift_server_log.txt
>
>
> I'm currently testing the Spark SQL Thrift server on a Kerberos-secured cluster in YARN mode. Currently, any user can access any table regardless of HDFS permissions, because all data is read as the hive user. In HiveServer2, the property hive.server2.enable.doAs=true causes all access to be performed as the submitting user. We should do the same.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org