Posted to common-issues@hadoop.apache.org by GitBox <gi...@apache.org> on 2022/03/14 15:33:57 UTC

[GitHub] [hadoop] jclarysse opened a new pull request #4070: Hadoop 18154 trunk

jclarysse opened a new pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070


   ### Description of PR
   The PR addresses a requirement to comply with the AWS security concept [IAM roles for service accounts](https://docs.aws.amazon.com/eks/latest/userguide/iam-roles-for-service-accounts.html) (IRSA) while operating a service that is not based on Apache Spark and runs inside Amazon Elastic Kubernetes Service (EKS).
   
   The code change consists of adding a new credentials provider class `org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider` to the [hadoop-aws](https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/index.html) module.
   
   ### How was this patch tested?
   No new unit test or integration test was created, on purpose. The patch was "only" tested against [Hadoop release 2.10.1](https://github.com/apache/hadoop/tree/rel/release-2.10.1), as part of our specific use case built on [Delta sharing service](https://github.com/delta-io/delta-sharing) v0.4.0, with the following Hadoop configuration (core-site.xml):
   ```xml
   <?xml version="1.0"?>
   <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
   <configuration>
     <property>
       <name>fs.s3a.aws.credentials.provider</name>
       <value>org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider</value>
     </property>
     <property>
       <name>fs.s3a.jwt.path</name>
       <value>/var/run/secrets/eks.amazonaws.com/serviceaccount/token</value>
     </property>
     <property>
       <name>fs.s3a.role.arn</name>
       <value>my_iam_role_arn</value>
     </property>
     <property>
       <name>fs.s3a.session.name</name>
       <value>my_iam_session_name</value>
     </property>
     <property>
         <name>fs.s3a.server-side-encryption-algorithm</name>
         <value>SSE-KMS</value>
     </property>
     <property>
         <name>fs.s3a.server-side-encryption.key</name>
         <value>my_kms_key_id</value>
     </property>      
   </configuration>
   ```
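   
   As an illustration only (not part of the patch), here is a minimal sketch of how the provider can then be exercised through the standard Hadoop `FileSystem` API, assuming the above core-site.xml is on the classpath; the bucket and object key are placeholders:
   ```java
   import java.io.BufferedReader;
   import java.io.InputStreamReader;
   import java.nio.charset.StandardCharsets;
   
   import org.apache.hadoop.conf.Configuration;
   import org.apache.hadoop.fs.FileSystem;
   import org.apache.hadoop.fs.Path;
   
   public class S3AWebIdentitySmokeTest {
     public static void main(String[] args) throws Exception {
       // Picks up core-site.xml from the classpath, including
       // fs.s3a.aws.credentials.provider and the fs.s3a.jwt.path settings above.
       Configuration conf = new Configuration();
       // Placeholder URI: replace bucket and key with real values.
       Path path = new Path("s3a://my-bucket/path/to/sample.txt");
       try (FileSystem fs = FileSystem.get(path.toUri(), conf);
            BufferedReader reader = new BufferedReader(
                new InputStreamReader(fs.open(path), StandardCharsets.UTF_8))) {
         System.out.println(reader.readLine());
       }
     }
   }
   ```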
   
   ### For code changes:
   - [X] The title of this PR starts with the corresponding JIRA issue 'HADOOP-18154'
   - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
   - [X] No new dependency was added to the code.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] jclarysse commented on a change in pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
jclarysse commented on a change in pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#discussion_r827164296



##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,79 @@
+package org.apache.hadoop.fs.s3a;
+
+import org.apache.commons.lang3.StringUtils;
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+/**
+ * WebIdentityTokenCredentialsProvider supports static configuration
+ * of OIDC token path, role ARN and role session name.
+ *
+ */
+//@InterfaceAudience.Public
+//@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+    public static final String NAME
+            = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+    //these are the parameters to document and to pass along with the class
+    //usually from import static org.apache.hadoop.fs.s3a.Constants.*;
+    public static final String JWT_PATH = "fs.s3a.jwt.path";
+    public static final String ROLE_ARN = "fs.s3a.role.arn";
+    public static final String SESSION_NAME = "fs.s3a.session.name";
+
+    /** Reuse the S3AFileSystem log. */
+    private static final Logger LOG = S3AFileSystem.LOG;
+
+    private String jwtPath;
+    private String roleARN;
+    private String sessionName;
+    private IOException lookupIOE;
+
+    public OIDCTokenCredentialsProvider(Configuration conf) {
+        try {
+            Configuration c = ProviderUtils.excludeIncompatibleCredentialProviders(
+                    conf, S3AFileSystem.class);
+            this.jwtPath = S3AUtils.lookupPassword(c, JWT_PATH, null);
+            this.roleARN = S3AUtils.lookupPassword(c, ROLE_ARN, null);
+            this.sessionName = S3AUtils.lookupPassword(c, SESSION_NAME, null);
+        } catch (IOException e) {
+            lookupIOE = e;
+        }
+    }
+
+    public AWSCredentials getCredentials() {
+        if (lookupIOE != null) {
+            // propagate any initialization problem
+            throw new CredentialInitializationException(lookupIOE.toString(),
+                    lookupIOE);
+        }
+
+        LOG.debug("jwtPath {} roleARN {}", jwtPath, roleARN);
+
+        if (!StringUtils.isEmpty(jwtPath) && !StringUtils.isEmpty(roleARN)) {
+            final AWSCredentialsProvider credentialsProvider =
+                WebIdentityTokenCredentialsProvider.builder()
+                    .webIdentityTokenFile(jwtPath)

Review comment:
       Not sure I get your point. Our service runs across multiple Kubernetes pods (replicas) using a ServiceAccount, so each of those pods automatically gets a volume attached that points to a token file created with the same ServiceAccount signature. Perhaps I am missing other use cases within the Hadoop ecosystem?






[GitHub] [hadoop] steveloughran commented on a change in pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
steveloughran commented on a change in pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#discussion_r831445562



##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/Constants.java
##########
@@ -142,6 +142,10 @@ private Constants() {
   public static final String ASSUMED_ROLE_CREDENTIALS_DEFAULT =
       SimpleAWSCredentialsProvider.NAME;
 
+  /**
+   * Absolute path to the web identity token file

Review comment:
       nit, add a . at the end of the sentence. javadoc versions like that.
   
   also in docs, say "path in local/mounted filesystem" so it is clear it is not a cluster fs like HDFS

##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,105 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.fs.s3a;

Review comment:
       can you move to org.apache.hadoop.fs.s3a.auth

##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,105 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.fs.s3a;
+
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+
+import org.apache.commons.lang3.StringUtils;
+
+import org.apache.hadoop.classification.InterfaceAudience;
+import org.apache.hadoop.classification.InterfaceStability;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+import static org.apache.hadoop.fs.s3a.Constants.*;
+
+/**
+ * Support OpenID Connect (OIDC) token for authenticating with AWS.
+ *
+ * Please note that users may reference this class name from configuration
+ * property fs.s3a.aws.credentials.provider.  Therefore, changing the class name
+ * would be a backward-incompatible change.
+ *
+ * This credential provider must not fail in creation because that will
+ * break a chain of credential providers.
+ */
+@InterfaceAudience.Public
+@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+  public static final String NAME
+          = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+  /** Reuse the S3AFileSystem log. */
+  private static final Logger LOG = S3AFileSystem.LOG;
+
+  private String jwtPath;
+  private String roleARN;
+  private String sessionName;
+  private IOException lookupIOE;
+
+  public OIDCTokenCredentialsProvider(Configuration conf) {
+    try {
+      Configuration c = ProviderUtils.excludeIncompatibleCredentialProviders(
+                  conf, S3AFileSystem.class);
+      this.jwtPath = S3AUtils.lookupPassword(c, JWT_PATH, null);
+      this.roleARN = S3AUtils.lookupPassword(c, ASSUMED_ROLE_ARN, null);
+      this.sessionName = S3AUtils.lookupPassword(c, ASSUMED_ROLE_SESSION_NAME, null);
+    } catch (IOException e) {
+      lookupIOE = e;
+    }
+  }
+
+  public AWSCredentials getCredentials() {
+      if (lookupIOE != null) {
+          // propagate any initialization problem
+          throw new CredentialInitializationException(lookupIOE.toString(),
+                  lookupIOE);
+      }
+
+      LOG.debug("jwtPath {} roleARN {} sessionName {}", jwtPath, roleARN, sessionName);
+
+      if (!StringUtils.isEmpty(jwtPath) && !StringUtils.isEmpty(roleARN)) {
+          final AWSCredentialsProvider credentialsProvider =
+              WebIdentityTokenCredentialsProvider.builder()
+                  .webIdentityTokenFile(jwtPath)
+                  .roleArn(roleARN)
+                  .roleSessionName(sessionName)
+                  .build();
+          return credentialsProvider.getCredentials();
+      }
+      else throw new CredentialInitializationException(
+              "OIDC token path or role ARN is null");
+  }
+
+  public void refresh() {}
+
+  @Override
+  public String toString() {
+      return String.format("%s " +
+                  "jwtPath {%s} roleARN {%s} sessionName {%s}",
+              getClass().getSimpleName(),
+              jwtPath, roleARN, sessionName);
+  }
+}

Review comment:
       nit, add a newline

##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,105 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.fs.s3a;
+
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+
+import org.apache.commons.lang3.StringUtils;
+
+import org.apache.hadoop.classification.InterfaceAudience;
+import org.apache.hadoop.classification.InterfaceStability;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+import static org.apache.hadoop.fs.s3a.Constants.*;
+
+/**
+ * Support OpenID Connect (OIDC) token for authenticating with AWS.
+ *
+ * Please note that users may reference this class name from configuration
+ * property fs.s3a.aws.credentials.provider.  Therefore, changing the class name
+ * would be a backward-incompatible change.
+ *
+ * This credential provider must not fail in creation because that will
+ * break a chain of credential providers.
+ */
+@InterfaceAudience.Public
+@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+  public static final String NAME
+          = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+  /** Reuse the S3AFileSystem log. */
+  private static final Logger LOG = S3AFileSystem.LOG;
+
+  private String jwtPath;
+  private String roleARN;
+  private String sessionName;
+  private IOException lookupIOE;
+
+  public OIDCTokenCredentialsProvider(Configuration conf) {
+    try {
+      Configuration c = ProviderUtils.excludeIncompatibleCredentialProviders(
+                  conf, S3AFileSystem.class);
+      this.jwtPath = S3AUtils.lookupPassword(c, JWT_PATH, null);
+      this.roleARN = S3AUtils.lookupPassword(c, ASSUMED_ROLE_ARN, null);
+      this.sessionName = S3AUtils.lookupPassword(c, ASSUMED_ROLE_SESSION_NAME, null);
+    } catch (IOException e) {
+      lookupIOE = e;
+    }
+  }
+
+  public AWSCredentials getCredentials() {
+      if (lookupIOE != null) {
+          // propagate any initialization problem
+          throw new CredentialInitializationException(lookupIOE.toString(),
+                  lookupIOE);
+      }
+
+      LOG.debug("jwtPath {} roleARN {} sessionName {}", jwtPath, roleARN, sessionName);
+
+      if (!StringUtils.isEmpty(jwtPath) && !StringUtils.isEmpty(roleARN)) {
+          final AWSCredentialsProvider credentialsProvider =
+              WebIdentityTokenCredentialsProvider.builder()
+                  .webIdentityTokenFile(jwtPath)
+                  .roleArn(roleARN)
+                  .roleSessionName(sessionName)
+                  .build();
+          return credentialsProvider.getCredentials();
+      }
+      else throw new CredentialInitializationException(

Review comment:
       better to throw explicit errors when the options are unset.
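   
   For example, a minimal sketch of how `getCredentials()` could name each missing option explicitly, reusing the fields and constants from the patch above (the exact messages are illustrative only, not part of the patch):
   
   ```java
   public AWSCredentials getCredentials() {
     if (lookupIOE != null) {
       // propagate any initialization problem
       throw new CredentialInitializationException(lookupIOE.toString(), lookupIOE);
     }
     // Sketch only: check each option separately so the exception names the
     // missing configuration key rather than a combined "path or role ARN" message.
     if (StringUtils.isEmpty(jwtPath)) {
       throw new CredentialInitializationException(
           "Web identity token file path is unset: " + JWT_PATH);
     }
     if (StringUtils.isEmpty(roleARN)) {
       throw new CredentialInitializationException(
           "Role ARN to assume is unset: " + ASSUMED_ROLE_ARN);
     }
     return WebIdentityTokenCredentialsProvider.builder()
         .webIdentityTokenFile(jwtPath)
         .roleArn(roleARN)
         .roleSessionName(sessionName)
         .build()
         .getCredentials();
   }
   ```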

##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,105 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.fs.s3a;
+
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+
+import org.apache.commons.lang3.StringUtils;
+

Review comment:
       you can cut this line






[GitHub] [hadoop] dannycjones commented on a change in pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
dannycjones commented on a change in pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#discussion_r832005639



##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/Constants.java
##########
@@ -142,6 +142,10 @@ private Constants() {
   public static final String ASSUMED_ROLE_CREDENTIALS_DEFAULT =
       SimpleAWSCredentialsProvider.NAME;
 
+  /**
+   * Absolute path to the web identity token file
+   */
+  public static final String JWT_PATH = "fs.s3a.jwt.path";

Review comment:
       If it's for OIDC / WebIdentity, can we change to something like `fs.s3a.oidc.jwt.path` / `fs.s3a.webidentity.jwt.path`?

##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,105 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.fs.s3a;
+
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+
+import org.apache.commons.lang3.StringUtils;
+
+import org.apache.hadoop.classification.InterfaceAudience;
+import org.apache.hadoop.classification.InterfaceStability;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+import static org.apache.hadoop.fs.s3a.Constants.*;
+
+/**
+ * Support OpenID Connect (OIDC) token for authenticating with AWS.
+ *
+ * Please note that users may reference this class name from configuration
+ * property fs.s3a.aws.credentials.provider.  Therefore, changing the class name
+ * would be a backward-incompatible change.
+ *
+ * This credential provider must not fail in creation because that will
+ * break a chain of credential providers.
+ */

Review comment:
       Does this credential provider actually support more than just OpenID Connect - anything that vends an identity under a JWT?
   
   This provider is allowing users to configure Role ARN, JWT path, and session name for the [SDK WebIdentityTokenCredentialsProvider](https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/auth/WebIdentityTokenCredentialsProvider.html). Should we move to similar naming?






[GitHub] [hadoop] steveloughran commented on pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
steveloughran commented on pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#issuecomment-1075401037


   Looking at the `WebIdentityTokenCredentialsProvider` I see that if it doesn't get the parameters then it will fall back to environment variables. We absolutely do not want to be picking up env vars, as it will only create support issues where configurations only work on certain machines. (Actually, we can ignore the session name settings as they are harmless.)
   
   I'm going to propose we go with @dannycjones's suggestion and support the whole set of values and have the prefix `fs.s3a.webidentity` for all of them.
   
   for the arn, we could have a property `fs.s3a.webidentity.role.arn` 
   
   but, what should we do if it wasn't set?
   1. fail to initialize
   2. have that null value force the env var lookup.
   
   I don't see any way to completely block the environment variable resolution, which is a pain.
   
   I also see in the internal library classes that sometimes roles are set up with an external ID, but that is not possible here. Is that an issue?
   




[GitHub] [hadoop] jclarysse commented on a change in pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
jclarysse commented on a change in pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#discussion_r826971919



##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,79 @@
+package org.apache.hadoop.fs.s3a;
+
+import org.apache.commons.lang3.StringUtils;
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+/**
+ * WebIdentityTokenCredentialsProvider supports static configuration
+ * of OIDC token path, role ARN and role session name.
+ *
+ */
+//@InterfaceAudience.Public
+//@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+    public static final String NAME
+            = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+    //these are the parameters to document and to pass along with the class
+    //usually from import static org.apache.hadoop.fs.s3a.Constants.*;
+    public static final String JWT_PATH = "fs.s3a.jwt.path";
+    public static final String ROLE_ARN = "fs.s3a.role.arn";

Review comment:
       done (reference ASSUMED_ROLE_ARN and ASSUMED_ROLE_SESSION_NAME)






[GitHub] [hadoop] dannycjones commented on a change in pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
dannycjones commented on a change in pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#discussion_r832005639



##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/Constants.java
##########
@@ -142,6 +142,10 @@ private Constants() {
   public static final String ASSUMED_ROLE_CREDENTIALS_DEFAULT =
       SimpleAWSCredentialsProvider.NAME;
 
+  /**
+   * Absolute path to the web identity token file
+   */
+  public static final String JWT_PATH = "fs.s3a.jwt.path";

Review comment:
       If it's for OIDC / WebIdentity, can we change to something like `fs.s3a.oidc.jwt.path` / `fs.s3a.webidentity.jwt.path`?
   
   + add `@value` to JavaDoc for IDE hints?

##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/Constants.java
##########
@@ -142,6 +142,10 @@ private Constants() {
   public static final String ASSUMED_ROLE_CREDENTIALS_DEFAULT =
       SimpleAWSCredentialsProvider.NAME;
 
+  /**
+   * Absolute path to the web identity token file
+   */
+  public static final String JWT_PATH = "fs.s3a.jwt.path";

Review comment:
       If it's for OIDC / WebIdentity, can we change to something like `fs.s3a.oidc.jwt.path` / `fs.s3a.webidentity.jwt.path`?
   
   Also, add `@value` to JavaDoc for IDE hints?
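   
   A minimal sketch of what such a documented constant in `Constants.java` could look like, combining the `{@value}` hint with the local/mounted-filesystem wording suggested earlier (key name and wording are only suggestions, not the final ones adopted by the patch):
   
   ```java
     /**
      * Absolute path in the local/mounted filesystem to the web identity token file.
      * Value: {@value}.
      */
     public static final String JWT_PATH = "fs.s3a.jwt.path";
   ```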






[GitHub] [hadoop] hadoop-yetus commented on pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#issuecomment-1068327215


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 53s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.  |
   |||| _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  36m 59s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 42s |  |  trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  compile  |   0m 35s |  |  trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 24s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 41s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 22s |  |  trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 29s |  |  trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m  5s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  23m 18s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 32s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 35s |  |  the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javac  |   0m 35s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 29s |  |  the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 29s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   0m 17s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/6/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) |  hadoop-tools/hadoop-aws: The patch generated 13 new + 0 unchanged - 0 fixed = 13 total (was 0)  |
   | +1 :green_heart: |  mvnsite  |   0m 34s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 15s |  |  the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 24s |  |  the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m  6s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  23m 11s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 29s |  |  hadoop-aws in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 29s |  |  The patch does not generate ASF License warnings.  |
   |  |   |  96m 34s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/6/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4070 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
   | uname | Linux 7947a5edc92d 4.15.0-153-generic #160-Ubuntu SMP Thu Jul 29 06:54:29 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 93773618af72386ae99d37540557715cd46ba160 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/6/testReport/ |
   | Max. process+thread count | 571 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/6/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] jclarysse commented on a change in pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
jclarysse commented on a change in pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#discussion_r826971164



##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,79 @@
+package org.apache.hadoop.fs.s3a;
+
+import org.apache.commons.lang3.StringUtils;
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+/**
+ * WebIdentityTokenCredentialsProvider supports static configuration
+ * of OIDC token path, role ARN and role session name.
+ *
+ */
+//@InterfaceAudience.Public
+//@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+    public static final String NAME
+            = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+    //these are the parameters to document and to pass along with the class
+    //usually from import static org.apache.hadoop.fs.s3a.Constants.*;
+    public static final String JWT_PATH = "fs.s3a.jwt.path";

Review comment:
       done






[GitHub] [hadoop] steveloughran commented on a change in pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
steveloughran commented on a change in pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#discussion_r831444157



##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,79 @@
+package org.apache.hadoop.fs.s3a;
+
+import org.apache.commons.lang3.StringUtils;
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+/**
+ * WebIdentityTokenCredentialsProvider supports static configuration
+ * of OIDC token path, role ARN and role session name.
+ *
+ */
+//@InterfaceAudience.Public
+//@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+    public static final String NAME
+            = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+    //these are the parameters to document and to pass along with the class
+    //usually from import static org.apache.hadoop.fs.s3a.Constants.*;
+    public static final String JWT_PATH = "fs.s3a.jwt.path";
+    public static final String ROLE_ARN = "fs.s3a.role.arn";
+    public static final String SESSION_NAME = "fs.s3a.session.name";
+
+    /** Reuse the S3AFileSystem log. */
+    private static final Logger LOG = S3AFileSystem.LOG;
+
+    private String jwtPath;
+    private String roleARN;
+    private String sessionName;
+    private IOException lookupIOE;
+
+    public OIDCTokenCredentialsProvider(Configuration conf) {
+        try {
+            Configuration c = ProviderUtils.excludeIncompatibleCredentialProviders(
+                    conf, S3AFileSystem.class);
+            this.jwtPath = S3AUtils.lookupPassword(c, JWT_PATH, null);
+            this.roleARN = S3AUtils.lookupPassword(c, ROLE_ARN, null);
+            this.sessionName = S3AUtils.lookupPassword(c, SESSION_NAME, null);
+        } catch (IOException e) {
+            lookupIOE = e;
+        }
+    }
+
+    public AWSCredentials getCredentials() {
+        if (lookupIOE != null) {
+            // propagate any initialization problem
+            throw new CredentialInitializationException(lookupIOE.toString(),
+                    lookupIOE);
+        }
+
+        LOG.debug("jwtPath {} roleARN {}", jwtPath, roleARN);
+
+        if (!StringUtils.isEmpty(jwtPath) && !StringUtils.isEmpty(roleARN)) {
+            final AWSCredentialsProvider credentialsProvider =
+                WebIdentityTokenCredentialsProvider.builder()
+                    .webIdentityTokenFile(jwtPath)

Review comment:
       I was just wondering how the secrets get around. For other credentials we can pick them up from the user launching, say, a distcp job, and they will get passed around. Alternatively, they can go into a cluster FS like HDFS.
   
   If it works with your k8s setup, then the docs should say "mount a shared volume in your containers". Support for credential propagation can be added by someone else when they need it.






[GitHub] [hadoop] hadoop-yetus commented on pull request #4070: Hadoop 18154 trunk

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#issuecomment-1067077987


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 56s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.  |
   |||| _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  36m 28s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 43s |  |  trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  compile  |   0m 36s |  |  trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 24s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 41s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 22s |  |  trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 30s |  |  trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m  4s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  23m 54s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 32s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 35s |  |  the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javac  |   0m 35s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 29s |  |  the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 29s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   0m 18s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/1/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) |  hadoop-tools/hadoop-aws: The patch generated 36 new + 0 unchanged - 0 fixed = 36 total (was 0)  |
   | +1 :green_heart: |  mvnsite  |   0m 35s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 16s |  |  the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 23s |  |  the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m  7s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  23m 11s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 33s |  |  hadoop-aws in the patch passed.  |
   | -1 :x: |  asflicense  |   0m 29s | [/results-asflicense.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/1/artifact/out/results-asflicense.txt) |  The patch generated 1 ASF License warnings.  |
   |  |   |  96m 43s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/1/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4070 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
   | uname | Linux 5e56e3e68c47 4.15.0-153-generic #160-Ubuntu SMP Thu Jul 29 06:54:29 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / d213b2ce9833bb7e482fbdebc76259713334c70f |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/1/testReport/ |
   | Max. process+thread count | 521 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/1/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] steveloughran commented on a change in pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
steveloughran commented on a change in pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#discussion_r826785122



##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,79 @@
+package org.apache.hadoop.fs.s3a;
+
+import org.apache.commons.lang3.StringUtils;
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+/**
+ * WebIdentityTokenCredentialsProvider supports static configuration
+ * of OIDC token path, role ARN and role session name.
+ *
+ */
+//@InterfaceAudience.Public
+//@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+    public static final String NAME
+            = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+    //these are the parameters to document and to pass along with the class
+    //usually from import static org.apache.hadoop.fs.s3a.Constants.*;
+    public static final String JWT_PATH = "fs.s3a.jwt.path";
+    public static final String ROLE_ARN = "fs.s3a.role.arn";

Review comment:
       and reference existing constants from the same class

##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,79 @@
+package org.apache.hadoop.fs.s3a;
+
+import org.apache.commons.lang3.StringUtils;
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+/**
+ * WebIdentityTokenCredentialsProvider supports static configuration
+ * of OIDC token path, role ARN and role session name.
+ *
+ */
+//@InterfaceAudience.Public
+//@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+    public static final String NAME
+            = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+    //these are the parameters to document and to pass along with the class
+    //usually from import static org.apache.hadoop.fs.s3a.Constants.*;
+    public static final String JWT_PATH = "fs.s3a.jwt.path";

Review comment:
       move new constants into org.apache.hadoop.fs.s3a.Constants

##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,79 @@
+package org.apache.hadoop.fs.s3a;
+
+import org.apache.commons.lang3.StringUtils;
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+/**
+ * WebIdentityTokenCredentialsProvider supports static configuration
+ * of OIDC token path, role ARN and role session name.
+ *
+ */
+//@InterfaceAudience.Public
+//@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+    public static final String NAME
+            = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+    //these are the parameters to document and to pass along with the class
+    //usually from import static org.apache.hadoop.fs.s3a.Constants.*;
+    public static final String JWT_PATH = "fs.s3a.jwt.path";
+    public static final String ROLE_ARN = "fs.s3a.role.arn";
+    public static final String SESSION_NAME = "fs.s3a.session.name";
+
+    /** Reuse the S3AFileSystem log. */
+    private static final Logger LOG = S3AFileSystem.LOG;
+
+    private String jwtPath;
+    private String roleARN;
+    private String sessionName;
+    private IOException lookupIOE;
+
+    public OIDCTokenCredentialsProvider(Configuration conf) {
+        try {
+            Configuration c = ProviderUtils.excludeIncompatibleCredentialProviders(
+                    conf, S3AFileSystem.class);
+            this.jwtPath = S3AUtils.lookupPassword(c, JWT_PATH, null);
+            this.roleARN = S3AUtils.lookupPassword(c, ROLE_ARN, null);
+            this.sessionName = S3AUtils.lookupPassword(c, SESSION_NAME, null);
+        } catch (IOException e) {
+            lookupIOE = e;
+        }
+    }
+
+    public AWSCredentials getCredentials() {
+        if (lookupIOE != null) {
+            // propagate any initialization problem
+            throw new CredentialInitializationException(lookupIOE.toString(),
+                    lookupIOE);
+        }
+
+        LOG.debug("jwtPath {} roleARN {}", jwtPath, roleARN);
+
+        if (!StringUtils.isEmpty(jwtPath) && !StringUtils.isEmpty(roleARN)) {
+            final AWSCredentialsProvider credentialsProvider =
+                WebIdentityTokenCredentialsProvider.builder()
+                    .webIdentityTokenFile(jwtPath)
+                    .roleArn(roleARN)
+                    .roleSessionName(sessionName)
+                    .build();
+            return credentialsProvider.getCredentials();
+        }
+        else throw new CredentialInitializationException(
+                "OIDC token path or role ARN is null");
+    }
+
+    public void refresh() {}
+
+    @Override
+    public String toString() {
+        return getClass().getSimpleName();

Review comment:
       be nice to include any non-secret values here, e.g. role name, just to help with logging

##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,79 @@
+package org.apache.hadoop.fs.s3a;
+
+import org.apache.commons.lang3.StringUtils;
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+/**
+ * WebIdentityTokenCredentialsProvider supports static configuration
+ * of OIDC token path, role ARN and role session name.
+ *
+ */
+//@InterfaceAudience.Public
+//@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+    public static final String NAME
+            = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+    //these are the parameters to document and to pass along with the class
+    //usually from import static org.apache.hadoop.fs.s3a.Constants.*;
+    public static final String JWT_PATH = "fs.s3a.jwt.path";
+    public static final String ROLE_ARN = "fs.s3a.role.arn";
+    public static final String SESSION_NAME = "fs.s3a.session.name";
+
+    /** Reuse the S3AFileSystem log. */
+    private static final Logger LOG = S3AFileSystem.LOG;
+
+    private String jwtPath;
+    private String roleARN;
+    private String sessionName;
+    private IOException lookupIOE;
+
+    public OIDCTokenCredentialsProvider(Configuration conf) {

Review comment:
       should credential providers be allowed to raise IOEs? we should be able to fix that

##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,79 @@
+package org.apache.hadoop.fs.s3a;
+
+import org.apache.commons.lang3.StringUtils;
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+/**
+ * WebIdentityTokenCredentialsProvider supports static configuration
+ * of OIDC token path, role ARN and role session name.
+ *
+ */
+//@InterfaceAudience.Public
+//@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+    public static final String NAME
+            = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+    //these are the parameters to document and to pass along with the class
+    //usually from import static org.apache.hadoop.fs.s3a.Constants.*;
+    public static final String JWT_PATH = "fs.s3a.jwt.path";
+    public static final String ROLE_ARN = "fs.s3a.role.arn";
+    public static final String SESSION_NAME = "fs.s3a.session.name";
+
+    /** Reuse the S3AFileSystem log. */
+    private static final Logger LOG = S3AFileSystem.LOG;
+
+    private String jwtPath;
+    private String roleARN;
+    private String sessionName;
+    private IOException lookupIOE;
+
+    public OIDCTokenCredentialsProvider(Configuration conf) {
+        try {
+            Configuration c = ProviderUtils.excludeIncompatibleCredentialProviders(
+                    conf, S3AFileSystem.class);
+            this.jwtPath = S3AUtils.lookupPassword(c, JWT_PATH, null);
+            this.roleARN = S3AUtils.lookupPassword(c, ROLE_ARN, null);
+            this.sessionName = S3AUtils.lookupPassword(c, SESSION_NAME, null);
+        } catch (IOException e) {
+            lookupIOE = e;
+        }
+    }
+
+    public AWSCredentials getCredentials() {
+        if (lookupIOE != null) {
+            // propagate any initialization problem
+            throw new CredentialInitializationException(lookupIOE.toString(),
+                    lookupIOE);
+        }
+
+        LOG.debug("jwtPath {} roleARN {}", jwtPath, roleARN);
+
+        if (!StringUtils.isEmpty(jwtPath) && !StringUtils.isEmpty(roleARN)) {
+            final AWSCredentialsProvider credentialsProvider =
+                WebIdentityTokenCredentialsProvider.builder()
+                    .webIdentityTokenFile(jwtPath)
+                    .roleArn(roleARN)
+                    .roleSessionName(sessionName)
+                    .build();
+            return credentialsProvider.getCredentials();
+        }

Review comment:
       nit: same line as the }

##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,79 @@
+package org.apache.hadoop.fs.s3a;
+
+import org.apache.commons.lang3.StringUtils;

Review comment:
       nit, we have some layout rules for imports. Here are my full IntelliJ settings for this, if it helps:
   https://gist.github.com/steveloughran/817dd90e0f1775ce2b6f24684dfb078c

##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,79 @@
+package org.apache.hadoop.fs.s3a;
+
+import org.apache.commons.lang3.StringUtils;
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+/**
+ * WebIdentityTokenCredentialsProvider supports static configuration
+ * of OIDC token path, role ARN and role session name.
+ *
+ */
+//@InterfaceAudience.Public
+//@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+    public static final String NAME
+            = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+    //these are the parameters to document and to pass along with the class
+    //usually from import static org.apache.hadoop.fs.s3a.Constants.*;
+    public static final String JWT_PATH = "fs.s3a.jwt.path";
+    public static final String ROLE_ARN = "fs.s3a.role.arn";
+    public static final String SESSION_NAME = "fs.s3a.session.name";
+
+    /** Reuse the S3AFileSystem log. */
+    private static final Logger LOG = S3AFileSystem.LOG;
+
+    private String jwtPath;
+    private String roleARN;
+    private String sessionName;
+    private IOException lookupIOE;
+
+    public OIDCTokenCredentialsProvider(Configuration conf) {
+        try {
+            Configuration c = ProviderUtils.excludeIncompatibleCredentialProviders(
+                    conf, S3AFileSystem.class);
+            this.jwtPath = S3AUtils.lookupPassword(c, JWT_PATH, null);
+            this.roleARN = S3AUtils.lookupPassword(c, ROLE_ARN, null);
+            this.sessionName = S3AUtils.lookupPassword(c, SESSION_NAME, null);
+        } catch (IOException e) {
+            lookupIOE = e;
+        }
+    }
+
+    public AWSCredentials getCredentials() {
+        if (lookupIOE != null) {
+            // propagate any initialization problem
+            throw new CredentialInitializationException(lookupIOE.toString(),
+                    lookupIOE);
+        }
+
+        LOG.debug("jwtPath {} roleARN {}", jwtPath, roleARN);
+
+        if (!StringUtils.isEmpty(jwtPath) && !StringUtils.isEmpty(roleARN)) {
+            final AWSCredentialsProvider credentialsProvider =
+                WebIdentityTokenCredentialsProvider.builder()
+                    .webIdentityTokenFile(jwtPath)

Review comment:
       this will handle local files only, so it won't work for jobs across a cluster unless the token is already there.
   
   either cluster fs paths will be needed (download locally and then reference), or we require it on the host of the user launching the job and then include the token data in a delegation token which goes with it. that's a lot more powerful, but a lot more work. best to leave that for a followup patch.
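   
   as a rough sketch of the first option (not part of this patch, and both paths below are made up): the launcher could stage the token from a cluster filesystem onto local disk before the provider is created, e.g.
   
   ```
   import org.apache.hadoop.conf.Configuration;
   import org.apache.hadoop.fs.FileSystem;
   import org.apache.hadoop.fs.Path;
   
   // Sketch only: copy an OIDC token held on a cluster filesystem down to
   // local disk so a local-file-only web identity provider can read it.
   Configuration conf = new Configuration();
   Path remoteToken = new Path("hdfs:///user/alice/oidc/token");   // illustrative path
   Path localToken = new Path("file:///tmp/oidc-token");           // illustrative path
   FileSystem fs = remoteToken.getFileSystem(conf);
   fs.copyToLocalFile(remoteToken, localToken);
   // point the provider at the freshly staged local copy
   conf.set("fs.s3a.jwt.path", "/tmp/oidc-token");
   ```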






[GitHub] [hadoop] hadoop-yetus commented on pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#issuecomment-1067954547


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   1m 29s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.  |
   |||| _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  38m  6s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 49s |  |  trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  compile  |   0m 41s |  |  trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 26s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 47s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 23s |  |  trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 32s |  |  trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 19s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  24m 12s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 36s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 42s |  |  the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javac  |   0m 42s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 32s |  |  the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 32s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   0m 19s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/2/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) |  hadoop-tools/hadoop-aws: The patch generated 21 new + 0 unchanged - 0 fixed = 21 total (was 0)  |
   | +1 :green_heart: |  mvnsite  |   0m 39s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 16s |  |  the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 27s |  |  the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 19s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  23m 22s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 41s |  |  hadoop-aws in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 31s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 100m 34s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/2/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4070 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
   | uname | Linux fb302480fe5f 4.15.0-153-generic #160-Ubuntu SMP Thu Jul 29 06:54:29 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / f7207421d305b78d47f0b48ab3656b4bede78ac7 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/2/testReport/ |
   | Max. process+thread count | 525 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/2/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] hadoop-yetus commented on pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#issuecomment-1067955131


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   1m  3s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.  |
   |||| _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  37m 39s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 48s |  |  trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  compile  |   0m 39s |  |  trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 26s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 47s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 23s |  |  trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 33s |  |  trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 16s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  24m 12s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 38s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 43s |  |  the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javac  |   0m 43s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 33s |  |  the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 33s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   0m 19s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/3/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) |  hadoop-tools/hadoop-aws: The patch generated 21 new + 0 unchanged - 0 fixed = 21 total (was 0)  |
   | +1 :green_heart: |  mvnsite  |   0m 39s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 16s |  |  the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 26s |  |  the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 21s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  23m 46s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 48s |  |  hadoop-aws in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 30s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 100m  7s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/3/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4070 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
   | uname | Linux f4fb3cfaeef7 4.15.0-153-generic #160-Ubuntu SMP Thu Jul 29 06:54:29 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / f7207421d305b78d47f0b48ab3656b4bede78ac7 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/3/testReport/ |
   | Max. process+thread count | 521 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/3/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] hadoop-yetus commented on pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#issuecomment-1068075795


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 53s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.  |
   |||| _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  37m 22s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 47s |  |  trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  compile  |   0m 35s |  |  trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 25s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 40s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 22s |  |  trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 31s |  |  trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 14s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  23m 59s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 33s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 38s |  |  the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javac  |   0m 38s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 29s |  |  the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 29s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   0m 18s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/4/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) |  hadoop-tools/hadoop-aws: The patch generated 22 new + 0 unchanged - 0 fixed = 22 total (was 0)  |
   | +1 :green_heart: |  mvnsite  |   0m 35s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 15s |  |  the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 23s |  |  the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m  9s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  23m 39s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 35s |  |  hadoop-aws in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 28s |  |  The patch does not generate ASF License warnings.  |
   |  |   |  98m 20s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/4/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4070 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
   | uname | Linux 427efd63f1bc 4.15.0-153-generic #160-Ubuntu SMP Thu Jul 29 06:54:29 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 623ce669fd6b9945dfac8050b120985a88b0a632 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/4/testReport/ |
   | Max. process+thread count | 600 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/4/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] hadoop-yetus commented on pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#issuecomment-1068099293


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 59s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.  |
   |||| _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  37m  1s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 43s |  |  trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  compile  |   0m 35s |  |  trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 25s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 43s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 23s |  |  trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 30s |  |  trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 18s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  24m 20s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 33s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 38s |  |  the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javac  |   0m 38s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 30s |  |  the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 30s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   0m 18s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/5/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) |  hadoop-tools/hadoop-aws: The patch generated 21 new + 0 unchanged - 0 fixed = 21 total (was 0)  |
   | +1 :green_heart: |  mvnsite  |   0m 34s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 16s |  |  the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04  |
   | +1 :green_heart: |  javadoc  |   0m 25s |  |  the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 11s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  23m 48s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |   2m 31s |  |  hadoop-aws in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 30s |  |  The patch does not generate ASF License warnings.  |
   |  |   |  98m 40s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/5/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4070 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
   | uname | Linux 12f0006b81cf 4.15.0-153-generic #160-Ubuntu SMP Thu Jul 29 06:54:29 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / e624ea1a7cc9224ac2f23ed69506711b9f52bb88 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/5/testReport/ |
   | Max. process+thread count | 521 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4070/5/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] hadoop-yetus removed a comment on pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
hadoop-yetus removed a comment on pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#issuecomment-1068327215








[GitHub] [hadoop] steveloughran commented on pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
steveloughran commented on pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#issuecomment-1083384625


   so as well as authing with a web identity token, we could use https://docs.aws.amazon.com/STS/latest/APIReference/API_AssumeRoleWithWebIdentity.html to get role credentials for up to 12h, which could go into a delegation token.
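   
   roughly, and only as a sketch (v1 SDK STS client; the role ARN, session name and token path below are placeholders), that exchange would look like:
   
   ```
   import java.nio.charset.StandardCharsets;
   import java.nio.file.Files;
   import java.nio.file.Paths;
   
   import com.amazonaws.auth.AWSStaticCredentialsProvider;
   import com.amazonaws.auth.AnonymousAWSCredentials;
   import com.amazonaws.services.securitytoken.AWSSecurityTokenService;
   import com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClientBuilder;
   import com.amazonaws.services.securitytoken.model.AssumeRoleWithWebIdentityRequest;
   import com.amazonaws.services.securitytoken.model.Credentials;
   
   // Sketch: exchange the OIDC token for temporary role credentials via STS.
   // AssumeRoleWithWebIdentity is an unsigned call, so anonymous credentials
   // are enough for the client. All literals below are placeholders.
   String token = new String(
       Files.readAllBytes(Paths.get("/path/to/oidc-token")),
       StandardCharsets.UTF_8);
   AWSSecurityTokenService sts = AWSSecurityTokenServiceClientBuilder.standard()
       .withCredentials(new AWSStaticCredentialsProvider(new AnonymousAWSCredentials()))
       .withRegion("us-east-1")
       .build();
   Credentials sessionCreds = sts.assumeRoleWithWebIdentity(
       new AssumeRoleWithWebIdentityRequest()
           .withRoleArn("arn:aws:iam::111122223333:role/example-role")
           .withRoleSessionName("example-session")
           .withWebIdentityToken(token)
           .withDurationSeconds(12 * 60 * 60))  // up to 12h if the role's max session allows it
       .getCredentials();
   // sessionCreds holds the access key, secret, session token and expiry,
   // i.e. the pieces that could be marshalled into a delegation token.
   ```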




[GitHub] [hadoop] jclarysse commented on a change in pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
jclarysse commented on a change in pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#discussion_r826973395



##########
File path: hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/OIDCTokenCredentialsProvider.java
##########
@@ -0,0 +1,79 @@
+package org.apache.hadoop.fs.s3a;
+
+import org.apache.commons.lang3.StringUtils;
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.ProviderUtils;
+import org.slf4j.Logger;
+
+import java.io.IOException;
+
+/**
+ * WebIdentityTokenCredentialsProvider supports static configuration
+ * of OIDC token path, role ARN and role session name.
+ *
+ */
+//@InterfaceAudience.Public
+//@InterfaceStability.Stable
+public class OIDCTokenCredentialsProvider implements AWSCredentialsProvider {
+    public static final String NAME
+            = "org.apache.hadoop.fs.s3a.OIDCTokenCredentialsProvider";
+
+    //these are the parameters to document and to pass along with the class
+    //usually from import static org.apache.hadoop.fs.s3a.Constants.*;
+    public static final String JWT_PATH = "fs.s3a.jwt.path";
+    public static final String ROLE_ARN = "fs.s3a.role.arn";
+    public static final String SESSION_NAME = "fs.s3a.session.name";
+
+    /** Reuse the S3AFileSystem log. */
+    private static final Logger LOG = S3AFileSystem.LOG;
+
+    private String jwtPath;
+    private String roleARN;
+    private String sessionName;
+    private IOException lookupIOE;
+
+    public OIDCTokenCredentialsProvider(Configuration conf) {
+        try {
+            Configuration c = ProviderUtils.excludeIncompatibleCredentialProviders(
+                    conf, S3AFileSystem.class);
+            this.jwtPath = S3AUtils.lookupPassword(c, JWT_PATH, null);
+            this.roleARN = S3AUtils.lookupPassword(c, ROLE_ARN, null);
+            this.sessionName = S3AUtils.lookupPassword(c, SESSION_NAME, null);
+        } catch (IOException e) {
+            lookupIOE = e;
+        }
+    }
+
+    public AWSCredentials getCredentials() {
+        if (lookupIOE != null) {
+            // propagate any initialization problem
+            throw new CredentialInitializationException(lookupIOE.toString(),
+                    lookupIOE);
+        }
+
+        LOG.debug("jwtPath {} roleARN {}", jwtPath, roleARN);
+
+        if (!StringUtils.isEmpty(jwtPath) && !StringUtils.isEmpty(roleARN)) {
+            final AWSCredentialsProvider credentialsProvider =
+                WebIdentityTokenCredentialsProvider.builder()
+                    .webIdentityTokenFile(jwtPath)
+                    .roleArn(roleARN)
+                    .roleSessionName(sessionName)
+                    .build();
+            return credentialsProvider.getCredentials();
+        }
+        else throw new CredentialInitializationException(
+                "OIDC token path or role ARN is null");
+    }
+
+    public void refresh() {}
+
+    @Override
+    public String toString() {
+        return getClass().getSimpleName();

Review comment:
       done






[GitHub] [hadoop] hadoop-yetus removed a comment on pull request #4070: HADOOP-18154. S3A Authentication to support WebIdentity

Posted by GitBox <gi...@apache.org>.
hadoop-yetus removed a comment on pull request #4070:
URL: https://github.com/apache/hadoop/pull/4070#issuecomment-1068099293





