Posted to issues@spark.apache.org by "Luciano Resende (JIRA)" <ji...@apache.org> on 2015/12/29 18:22:49 UTC

[jira] [Commented] (SPARK-5159) Thrift server does not respect hive.server2.enable.doAs=true

    [ https://issues.apache.org/jira/browse/SPARK-5159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15074085#comment-15074085 ] 

Luciano Resende commented on SPARK-5159:
----------------------------------------

Is this still an issue? Most of the code from the initial PR seems to have been merged via SPARK-6910, and when I run the Spark Hive sample in YARN mode (Spark 1.5.1), my user appears to be impersonated correctly and I get the expected exception saying my user does not have permission.

Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=lresende, access=WRITE, inode="/user/lresende/.sparkStaging/application_1450998431030_0001":hdfs:hdfs:drwxr-xr-x

Is there a specific scenario in which this is still reproducible?
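
In case it helps anyone trying to reproduce this, the kind of check I have in mind is roughly the following sketch (purely illustrative, not from the original report: it assumes a Kerberized cluster with the Thrift server on thrift-host:10000, and the host, realm and table name are placeholders):

import java.sql.DriverManager

object DoAsCheck {
  def main(args: Array[String]): Unit = {
    // Load the Hive JDBC driver used to talk to the Spark Thrift server.
    Class.forName("org.apache.hive.jdbc.HiveDriver")

    // On a Kerberized cluster the service principal goes into the JDBC URL.
    val url = "jdbc:hive2://thrift-host:10000/default;principal=hive/_HOST@EXAMPLE.COM"
    val conn = DriverManager.getConnection(url)

    // Query a table whose HDFS files are readable only by their owner.
    // If doAs is honoured, a user without HDFS read permission should see
    // an AccessControlException here instead of rows.
    val stmt = conn.createStatement()
    val rs = stmt.executeQuery("SELECT * FROM restricted_table LIMIT 1")
    while (rs.next()) println(rs.getString(1))

    conn.close()
  }
}

Running that as two different OS/Kerberos users (one with HDFS access to the table's files, one without) should make it clear whether reads go through as the connecting user or as the hive user.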

> Thrift server does not respect hive.server2.enable.doAs=true
> ------------------------------------------------------------
>
>                 Key: SPARK-5159
>                 URL: https://issues.apache.org/jira/browse/SPARK-5159
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0
>            Reporter: Andrew Ray
>
> I'm currently testing the Spark SQL Thrift server on a Kerberos-secured cluster in YARN mode. Currently any user can access any table regardless of HDFS permissions, because all data is read as the hive user. In HiveServer2, the property hive.server2.enable.doAs=true causes all access to be performed as the submitting user. We should do the same.
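
For reference, the HiveServer2 behaviour described above is controlled by the hive.server2.enable.doAs property, normally set in hive-site.xml (or passed with --hiveconf when starting the Thrift server). A minimal sketch for checking what a given configuration resolves it to, assuming hive-site.xml is on the classpath (illustrative only, not part of the original report):

import org.apache.hadoop.hive.conf.HiveConf

object DoAsConfigCheck {
  def main(args: Array[String]): Unit = {
    // HiveConf picks up hive-site.xml from the classpath, i.e. the same
    // file the Thrift server reads its hive.server2.enable.doAs value from.
    val conf = new HiveConf()
    println("hive.server2.enable.doAs = " +
      conf.getBoolVar(HiveConf.ConfVars.HIVE_SERVER2_ENABLE_DOAS))
  }
}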



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
