Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/04/01 08:55:44 UTC

[GitHub] [spark] gaborgsomogyi commented on a change in pull request #24170: [SPARK-26998][CORE] Remove SSL configuration from executors

URL: https://github.com/apache/spark/pull/24170#discussion_r270770830
 
 

 ##########
 File path: core/src/test/scala/org/apache/spark/SparkConfSuite.scala
 ##########
 @@ -354,6 +354,20 @@ class SparkConfSuite extends SparkFunSuite with LocalSparkContext with ResetSyst
     }
   }
 
+  test("SPARK-26998: SSL configuration not needed on executors") {
+    val conf = new SparkConf(false)
+    conf.validateSettings()
+
+    conf.set("spark.ssl.enabled", "true")
+    conf.set("spark.ssl.keyPassword", "password")
+    conf.set("spark.ssl.keyStorePassword", "password")
+    conf.set("spark.ssl.trustStorePassword", "password")
+    conf.validateSettings()
+
+    val filtered = conf.getAll.filter { case (k, _) => SparkConf.isExecutorStartupConf(k) }
 
 Review comment:
   The exception states exactly which parameter and value are problematic, so I'm not sure a more complex construct would help here:
   ```
   Array((spark.network.foo,bar)) was not empty
   ScalaTestFailureLocation: org.apache.spark.SparkConfSuite at (SparkConfSuite.scala:366)
   org.scalatest.exceptions.TestFailedException: Array((spark.network.foo,bar)) was not empty
   ```
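   For context, the assertion under discussion checks that after SPARK-26998 no `spark.ssl.*` key is treated as an executor startup conf. Below is a minimal, self-contained sketch of that filtering logic, not the actual `SparkConf` source; the prefix list in `isExecutorStartupConf` is an assumption chosen only to illustrate the idea, with `spark.network.foo` taken from the failure output above:

   ```scala
   // Hedged sketch (not real Spark code): a stand-in for
   // SparkConf.isExecutorStartupConf after SPARK-26998, where spark.ssl.*
   // keys are no longer forwarded to executors at startup.
   object ExecutorConfSketch {
     // Assumed prefix list for illustration; note no "spark.ssl" prefix here.
     def isExecutorStartupConf(name: String): Boolean =
       name.startsWith("spark.auth") ||
       name.startsWith("spark.rpc") ||
       name.startsWith("spark.network")

     def main(args: Array[String]): Unit = {
       val conf = Map(
         "spark.ssl.enabled"     -> "true",
         "spark.ssl.keyPassword" -> "password",
         "spark.network.foo"     -> "bar" // the key from the failure message above
       )
       // Same shape as the test: keep only executor startup confs.
       val filtered = conf.filter { case (k, _) => isExecutorStartupConf(k) }
       println(filtered) // the spark.ssl.* keys are dropped
     }
   }
   ```

   With this predicate, a `filtered shouldBe empty`-style assertion would fail exactly as shown, naming the offending `(spark.network.foo,bar)` pair.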
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org