Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/09/07 13:13:17 UTC

[GitHub] [spark] pralabhkumar commented on a change in pull request #33917: [SPARK-36622][CORE] Making spark.history.kerberos.principal _HOST compliant

pralabhkumar commented on a change in pull request #33917:
URL: https://github.com/apache/spark/pull/33917#discussion_r703497729



##########
File path: core/src/test/scala/org/apache/spark/deploy/SparkHadoopUtilSuite.scala
##########
@@ -80,6 +82,18 @@ class SparkHadoopUtilSuite extends SparkFunSuite {
     assertConfigValue(hadoopConf, "fs.s3a.endpoint", null)
   }
 
+  /**
+   * Test that the _HOST pattern is replaced with the server's canonical address.
+   */
+  test("server principal with _HOST pattern") {

Review comment:
       done

##########
File path: core/src/test/scala/org/apache/spark/deploy/SparkHadoopUtilSuite.scala
##########
@@ -80,6 +82,18 @@ class SparkHadoopUtilSuite extends SparkFunSuite {
     assertConfigValue(hadoopConf, "fs.s3a.endpoint", null)
   }
 
+  /**
+   * Test that the _HOST pattern is replaced with the server's canonical address.
+   */
+  test("server principal with _HOST pattern") {
+    assert(SparkHadoopUtil.get.getServerPrincipal("spark/_HOST@realm.com")
+      === "spark/%s@realm.com".format(InetAddress.getLocalHost.getCanonicalHostName())
+      , s"Mismatch in expected value")
+    assert(SparkHadoopUtil.get.getServerPrincipal("spark/0.0.0.0@realm.com")
+      === "spark/0.0.0.0@realm.com".format(InetAddress.getLocalHost.getCanonicalHostName())
+      , s"Mismatch in expected value")

Review comment:
       done
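
For readers following this thread, here is a minimal sketch of the kind of helper the test above exercises. It assumes the Spark-side method simply delegates to Hadoop's org.apache.hadoop.security.SecurityUtil.getServerPrincipal, which substitutes _HOST with the supplied hostname; the object and wrapper below are illustrative only and are not the actual patch in #33917.

  import java.net.InetAddress

  import org.apache.hadoop.security.SecurityUtil

  object ServerPrincipalSketch {
    /**
     * Hypothetical stand-in for the method under test: replaces the _HOST token
     * in a Kerberos principal with this machine's canonical hostname, and leaves
     * principals that do not use the _HOST pattern unchanged.
     */
    def getServerPrincipal(principal: String): String = {
      // Hadoop's SecurityUtil performs the _HOST substitution (lowercasing the
      // supplied hostname); other principals pass through as-is, which is why the
      // "spark/0.0.0.0@realm.com" assertion above expects its input back unchanged.
      SecurityUtil.getServerPrincipal(principal, InetAddress.getLocalHost.getCanonicalHostName)
    }
  }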




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


