Posted to common-issues@hadoop.apache.org by "ASF GitHub Bot (Jira)" <ji...@apache.org> on 2023/02/05 14:31:00 UTC

[jira] [Commented] (HADOOP-18618) Support custom property for credential provider path

    [ https://issues.apache.org/jira/browse/HADOOP-18618?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17684309#comment-17684309 ] 

ASF GitHub Bot commented on HADOOP-18618:
-----------------------------------------

surendralilhore opened a new pull request, #5352:
URL: https://github.com/apache/hadoop/pull/5352

   Hadoop allows the configuration of a credential provider path through the property "hadoop.security.credential.provider.path", and the Configuration#getPassword() method retrieves the credentials from this provider.
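   For context, this is roughly how the existing mechanism is used today; the keystore location and alias below are only illustrative:
   
   import org.apache.hadoop.conf.Configuration;
   
   public class DefaultProviderPathExample {
       public static void main(String[] args) throws Exception {
           Configuration conf = new Configuration();
           // Shared, default provider path property (illustrative keystore location).
           conf.set("hadoop.security.credential.provider.path",
                    "jceks://file/etc/hadoop/conf/credentials.jceks");
           // Looks the alias up in the configured providers and falls back to the
           // plain configuration value if no provider holds it.
           char[] secret = conf.getPassword("db.password.alias");
           // ... use the secret, then clear it ...
       }
   }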
   
   However, sharing common credential provider properties across components such as Hive, HDFS, and MapReduce causes problems when each component wants its own JCEKS file for its credentials. For example, the value set in core-site.xml can be overridden by the one in hive-site.xml. Today the only way to avoid such conflicts is for all components to share a single credential provider path and add all of their credentials to it.
   
   Azure storage supports account-specific credentials, so the credential provider mechanism should allow a separate JCEKS file to be configured per account, through a per-account property such as "fs.azure.account.credential.provider.path.<account>.blob.core.windows.net". An illustrative configuration follows.
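   As a sketch of the proposed per-account property (the account names and keystore paths below are made up), each Azure storage account could point at its own keystore:
   
   import org.apache.hadoop.conf.Configuration;
   
   public class PerAccountProviderPathExample {
       public static void main(String[] args) {
           Configuration conf = new Configuration();
           // Proposed per-account provider path properties; each account keeps its
           // credentials in a separate JCEKS file instead of one shared keystore.
           conf.set("fs.azure.account.credential.provider.path.store1.blob.core.windows.net",
                    "jceks://hdfs/user/azure/store1-credentials.jceks");
           conf.set("fs.azure.account.credential.provider.path.store2.blob.core.windows.net",
                    "jceks://hdfs/user/azure/store2-credentials.jceks");
       }
   }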
   
   To accommodate this, the Configuration#getPassword() method should accept a custom property name for the credential provider path and resolve credentials from the providers that property points to, so callers can override the current default property.
   
   // Existing method: resolves credential providers from the default property
   // "hadoop.security.credential.provider.path".
   public char[] getPassword(String name) throws IOException {
       ......
       ......
   }
   
   // Proposed overload: resolves credential providers from the path configured
   // under the given providerKey property instead of the default one.
   public char[] getPassword(String name, String providerKey) throws IOException {
       ......
       ......
   }
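   A rough sketch of how a caller such as the Azure connector could use the proposed overload, assuming it is added as shown above; the class name, helper, and account value are illustrative:
   
   import java.io.IOException;
   import org.apache.hadoop.conf.Configuration;
   
   public class CustomProviderKeyExample {
       // Illustrative helper; relies on the proposed getPassword(name, providerKey) overload.
       static char[] azureAccountKey(Configuration conf, String account) throws IOException {
           String providerKey = "fs.azure.account.credential.provider.path." + account;
           // Resolve the account key from the account-specific keystore configured under
           // providerKey rather than from the shared default provider path.
           return conf.getPassword("fs.azure.account.key." + account, providerKey);
       }
   }
   
   With something like this in place, Hive, HDFS, MapReduce, or individual Azure accounts would no longer have to merge their credentials into a single shared keystore.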




> Support custom property for credential provider path
> ----------------------------------------------------
>
>                 Key: HADOOP-18618
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18618
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: common
>    Affects Versions: 3.1.3
>            Reporter: Surendra Singh Lilhore
>            Assignee: Surendra Singh Lilhore
>            Priority: Minor
>
> Hadoop allows the configuration of a credential provider path through the property "{*}hadoop.security.credential.provider.path{*}", and the {{Configuration#getPassword()}} method retrieves the credentials from this provider.
> However, sharing common credential provider properties across components such as Hive, HDFS, and MapReduce causes problems when each component wants its own JCEKS file for its credentials. For example, the value set in core-site.xml can be overridden by the one in hive-site.xml. Today the only way to avoid such conflicts is for all components to share a single credential provider path and add all of their credentials to it.
> Azure storage supports account-specific credentials, so the credential provider mechanism should allow a separate JCEKS file to be configured per account, through a per-account property such as "{*}fs.azure.account.credential.provider.path.<account>.blob.core.windows.net{*}".
> To accommodate this, the {{Configuration#getPassword()}} method should accept a custom property name for the credential provider path and resolve credentials from the providers that property points to, so callers can override the current default property.
> {code:java}
> public char[] getPassword(String name) throws IOException {
>     ......
>     ......
> }
> public char[] getPassword(String name, String providerKey) throws IOException {                  
>     ......
>     ......
> }{code}
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org