Posted to commits@spark.apache.org by va...@apache.org on 2016/12/13 18:37:52 UTC

spark git commit: [SPARK-18840][YARN] Avoid throwing an exception when getting the token renewal interval in a non-HDFS secure environment

Repository: spark
Updated Branches:
  refs/heads/master 5572ccf86 -> 43298d157


[SPARK-18840][YARN] Avoid throwing an exception when getting the token renewal interval in a non-HDFS secure environment

## What changes were proposed in this pull request?

Fix the `java.util.NoSuchElementException` thrown when running Spark in a secure environment without HDFS.

The current code assumes a token of kind `HDFS_DELEGATION_KIND` will always be present in the `Credentials`. But in some cloud environments HDFS is not required, so the lookup should not throw when no such token exists (see the sketch below).
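
To illustrate the failure mode, here is a minimal standalone Scala sketch
(made-up token-kind strings, not Spark's actual `Credentials` API): on an
empty match, `filter(...).head` throws, while `find` returns an `Option`.

    object FindVsHead {
      def main(args: Array[String]): Unit = {
        // Pretend the cluster only handed out a KMS token, no HDFS token.
        val tokenKinds = Seq("KMS_DELEGATION_TOKEN")

        // Old pattern: throws java.util.NoSuchElementException here.
        // tokenKinds.filter(_ == "HDFS_DELEGATION_TOKEN").head

        // New pattern: absence becomes a value the caller can handle.
        val hdfsToken: Option[String] =
          tokenKinds.find(_ == "HDFS_DELEGATION_TOKEN")
        println(hdfsToken) // prints None instead of crashing
      }
    }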

## How was this patch tested?

Manually verified in a local environment.

Author: jerryshao <ss...@hortonworks.com>

Closes #16265 from jerryshao/SPARK-18840.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/43298d15
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/43298d15
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/43298d15

Branch: refs/heads/master
Commit: 43298d157d58d5d03ffab818f8cdfc6eac783c55
Parents: 5572ccf
Author: jerryshao <ss...@hortonworks.com>
Authored: Tue Dec 13 10:37:45 2016 -0800
Committer: Marcelo Vanzin <va...@cloudera.com>
Committed: Tue Dec 13 10:37:45 2016 -0800

----------------------------------------------------------------------
 .../yarn/security/HDFSCredentialProvider.scala  | 21 ++++++++++----------
 1 file changed, 11 insertions(+), 10 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/43298d15/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/security/HDFSCredentialProvider.scala
----------------------------------------------------------------------
diff --git a/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/security/HDFSCredentialProvider.scala b/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/security/HDFSCredentialProvider.scala
index 8d06d73..ebb176b 100644
--- a/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/security/HDFSCredentialProvider.scala
+++ b/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/security/HDFSCredentialProvider.scala
@@ -72,21 +72,22 @@ private[security] class HDFSCredentialProvider extends ServiceCredentialProvider
     // We cannot use the tokens generated with renewer yarn. Trying to renew
     // those will fail with an access control issue. So create new tokens with the logged in
     // user as renewer.
-    sparkConf.get(PRINCIPAL).map { renewer =>
+    sparkConf.get(PRINCIPAL).flatMap { renewer =>
       val creds = new Credentials()
       nnsToAccess(hadoopConf, sparkConf).foreach { dst =>
         val dstFs = dst.getFileSystem(hadoopConf)
         dstFs.addDelegationTokens(renewer, creds)
       }
-      val t = creds.getAllTokens.asScala
-        .filter(_.getKind == DelegationTokenIdentifier.HDFS_DELEGATION_KIND)
-        .head
-      val newExpiration = t.renew(hadoopConf)
-      val identifier = new DelegationTokenIdentifier()
-      identifier.readFields(new DataInputStream(new ByteArrayInputStream(t.getIdentifier)))
-      val interval = newExpiration - identifier.getIssueDate
-      logInfo(s"Renewal Interval is $interval")
-      interval
+      val hdfsToken = creds.getAllTokens.asScala
+        .find(_.getKind == DelegationTokenIdentifier.HDFS_DELEGATION_KIND)
+      hdfsToken.map { t =>
+        val newExpiration = t.renew(hadoopConf)
+        val identifier = new DelegationTokenIdentifier()
+        identifier.readFields(new DataInputStream(new ByteArrayInputStream(t.getIdentifier)))
+        val interval = newExpiration - identifier.getIssueDate
+        logInfo(s"Renewal Interval is $interval")
+        interval
+      }
     }
   }
 


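A note on the `map` -> `flatMap` change in the hunk above: once the body
returns `Option[Long]` instead of `Long`, keeping `map` on the
`sparkConf.get(PRINCIPAL)` Option would produce a nested
`Option[Option[Long]]`, so `flatMap` is needed to keep the result a flat
`Option[Long]`. A minimal Scala sketch (hypothetical values, not the Spark
code itself):

    object MapVsFlatMap {
      def main(args: Array[String]): Unit = {
        val principal: Option[String] = Some("spark/host@EXAMPLE.COM")

        // Stand-in for the token lookup: None when no HDFS token exists.
        def renewalInterval(renewer: String): Option[Long] = None

        val nested: Option[Option[Long]] = principal.map(renewalInterval)
        val flat: Option[Long] = principal.flatMap(renewalInterval)

        println(nested) // Some(None)
        println(flat)   // None: flatMap collapses the two layers
      }
    }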