Posted to common-issues@hadoop.apache.org by "Steve Loughran (JIRA)" <ji...@apache.org> on 2017/08/30 21:44:00 UTC

[jira] [Updated] (HADOOP-14821) Executing the command 'hdfs -Dhadoop.security.credential.provider.path=file1.jceks,file2.jceks' fails;

     [ https://issues.apache.org/jira/browse/HADOOP-14821?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Steve Loughran updated HADOOP-14821:
------------------------------------
    Component/s: security
                 fs/s3

> Executing the command 'hdfs -Dhadoop.security.credential.provider.path=file1.jceks,file2.jceks' fails;
> ------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-14821
>                 URL: https://issues.apache.org/jira/browse/HADOOP-14821
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: fs/s3, hdfs-client, security
>    Affects Versions: 2.7.3
>         Environment: hadoop-common-2.7.3.2.6.0.11-1
>            Reporter: Ernani Pereira de Mattos Junior
>            Priority: Critical
>              Labels: features
>
> ======= 
> Request Use Case: 
> UC1: 
> The customer has the path to a directory (and subdirectories) full of keystore files. The customer knows they do not have access to all of the keys but, ignoring this, builds a list of the keys. 
> UC1.2: 
> The customer tries, in FIFO order, each key on the list. If access is granted locally, the key is used to attempt the login to s3a. 
> UC1.3: 
> The customer tries, in FIFO order, each key on the list. If access is not granted locally, the login to s3a is skipped and the next key on the list is tried. 
> ===========
> Currently, UC1.3 fails with the exception below and does not try the next key (a sketch of the intended behaviour follows the stack trace):
> $ hdfs  --loglevel DEBUG dfs -Dhadoop.security.credential.provider.path=jceks://hdfs/tmp/aws.jceks,jceks://hdfs/tmp/awst.jceks -ls s3a://av-dl-hwx-nprod-anhffpoc-enriched/hive/e_ceod/
> Not retrying because try once and fail.
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=502549376, access=READ, inode="/tmp/aws.jceks":admin:hdfs:-rwx------
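>
> A minimal sketch of the skip-and-continue behaviour being requested, written against the public Hadoop credential-provider API (CredentialProviderFactory.getProviders, CredentialProvider.getCredentialEntry). The class and method names here (LenientCredentialLookup, resolveFirstReadable) are illustrative only, not existing Hadoop code:
> {code:java}
> import java.io.IOException;
> import java.util.List;
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.security.alias.CredentialProvider;
> import org.apache.hadoop.security.alias.CredentialProviderFactory;
>
> public class LenientCredentialLookup {
>
>   /**
>    * Try each provider URI in turn; return the first credential found,
>    * or null if every provider is unreadable or lacks the alias.
>    */
>   public static char[] resolveFirstReadable(Configuration conf, String alias) {
>     String path = conf.get(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH);
>     if (path == null) {
>       return null;
>     }
>     for (String uri : path.split(",")) {
>       try {
>         // Open each provider in isolation so that one unreadable jceks
>         // file cannot abort the whole lookup.
>         Configuration single = new Configuration(conf);
>         single.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH, uri.trim());
>         List<CredentialProvider> providers =
>             CredentialProviderFactory.getProviders(single);
>         for (CredentialProvider provider : providers) {
>           CredentialProvider.CredentialEntry entry =
>               provider.getCredentialEntry(alias);
>           if (entry != null) {
>             return entry.getCredential();
>           }
>         }
>       } catch (IOException e) {
>         // e.g. the AccessControlException above on /tmp/aws.jceks:
>         // log it and fall through to the next provider on the list.
>       }
>     }
>     return null;
>   }
> }
> {code}
> With a lookup like this, the DEBUG run above would log the permission failure on jceks://hdfs/tmp/aws.jceks and continue to jceks://hdfs/tmp/awst.jceks instead of failing the whole command.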



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org