Posted to common-issues@hadoop.apache.org by "Charles Lamb (JIRA)" <ji...@apache.org> on 2015/01/27 19:09:35 UTC

[jira] [Resolved] (HADOOP-11478) HttpFSServer does not properly impersonate a real user when executing "open" operation in a kerberised environment

     [ https://issues.apache.org/jira/browse/HADOOP-11478?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Charles Lamb resolved HADOOP-11478.
-----------------------------------
    Resolution: Not a Problem

> HttpFSServer does not properly impersonate a real user when executing "open" operation in a kerberised environment
> ------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-11478
>                 URL: https://issues.apache.org/jira/browse/HADOOP-11478
>             Project: Hadoop Common
>          Issue Type: Bug
>    Affects Versions: 2.6.0
>         Environment: CentOS
>            Reporter: Ranadip
>            Priority: Blocker
>
> Setup:
> - Kerberos enabled in the cluster, including Hue SSO
> - Encryption enabled using KMS; an encryption key and encryption zone were created. A KMS key-level ACL grants all access to the key to the real user only, and to no one else.
> Manifestation:
> Using Hue, the real user logged in with Kerberos credentials; for direct access, the user runs kinit and then makes curl calls.
> Creating a new file inside the encryption zone succeeds as expected,
> but attempting to view the contents of the file fails with the exception:
> "User [httpfs] is not authorized to perform [DECRYPT_EEK] on key with ACL name [mykeyname]!!"
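For reference, the direct-access path described above would look roughly like this. Host, port, realm, and file path are placeholders for illustration; the HttpFS REST endpoint (`/webhdfs/v1`, port 14000) and the SPNEGO curl flags are the standard ones:

```shell
# Obtain a Kerberos ticket as the real user (placeholder principal).
kinit alice@EXAMPLE.COM

# CREATE inside the encryption zone succeeds as described:
curl --negotiate -u : -X PUT \
  "http://httpfs.example.com:14000/webhdfs/v1/ez/file.txt?op=CREATE&data=true" \
  -H "Content-Type: application/octet-stream" -d "hello"

# OPEN on the same file fails with the DECRYPT_EEK authorization error:
curl --negotiate -u : \
  "http://httpfs.example.com:14000/webhdfs/v1/ez/file.txt?op=OPEN"
```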
> This may be linked to HDFS-6849. In HttpFSServer.java, the OPEN handler calls command.execute(fs) directly (and this fails), whereas the CREATE handler wraps the same call in fsExecute(user, command), which performs the user impersonation. This difference appears to cause the problem.
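The impersonation gap described above can be sketched with a self-contained toy model (all class and method names below are illustrative stand-ins, not Hadoop's actual API): a doAs-style wrapper, like Hadoop's UserGroupInformation.doAs, runs the operation as the end user, so a key ACL admitting only the real user passes; invoking the operation directly runs it as the service principal ("httpfs") and is rejected.

```java
import java.util.function.Function;

public class ImpersonationSketch {
    // Stand-in for a doAs-style wrapper: runs the work as the given user.
    static <T> T doAs(String user, Function<String, T> work) {
        return work.apply(user);
    }

    // Stand-in for the KMS key ACL check: only the real user ("alice" here,
    // a hypothetical name) may perform DECRYPT_EEK on the key.
    static String decryptEek(String caller) {
        if (!caller.equals("alice")) {
            throw new SecurityException(
                "User [" + caller + "] is not authorized to perform [DECRYPT_EEK]");
        }
        return "decrypted-key";
    }

    public static void main(String[] args) {
        // CREATE-style path: wrapped in doAs, executes as the end user,
        // so the ACL check passes.
        System.out.println(doAs("alice", ImpersonationSketch::decryptEek));

        // OPEN-style path: executed directly, so the caller is the service
        // user and the ACL check rejects it.
        try {
            decryptEek("httpfs");
        } catch (SecurityException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

The fix suggested by the report is simply to route OPEN through the same wrapped execution path that CREATE already uses, so both operations reach the KMS as the impersonated end user.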



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)