Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/03/15 21:14:00 UTC

[GitHub] [spark] attilapiros commented on a change in pull request #24103: [SPARK-27141][YARN] Use ConfigEntry for hardcoded configs for Yarn

URL: https://github.com/apache/spark/pull/24103#discussion_r266148863
 
 

 ##########
 File path: resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnAllocatorSuite.scala
 ##########
 @@ -48,8 +49,8 @@ class MockResolver extends SparkRackResolver {
 class YarnAllocatorSuite extends SparkFunSuite with Matchers with BeforeAndAfterEach {
   val conf = new YarnConfiguration()
   val sparkConf = new SparkConf()
-  sparkConf.set("spark.driver.host", "localhost")
-  sparkConf.set("spark.driver.port", "4040")
+  sparkConf.set(DRIVER_HOST_ADDRESS.key, "localhost")
+  sparkConf.set(DRIVER_PORT.key, "4040")
 
 Review comment:
   `SparkConf#set` is an overloaded method: if you pass a `ConfigEntry` as the first argument, the call is typesafe (in this case, for `DRIVER_PORT`, an `Int` is expected as the second argument).
   
   If the first argument of `SparkConf#set` is a `String`, then the value must also be passed as a `String` (not typesafe).
   
   I would prefer the typesafe setter; to get it, just drop the `.key` calls on the `ConfigEntry`s you pass to `SparkConf#set`.
   
   For the launcher (e.g. in BaseYarnClusterSuite.scala), only a `String` key can be passed when `setConf` is called.
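   To illustrate the difference between the two overloads, here is a minimal self-contained sketch. The `ConfigEntry` and `Conf` classes below are simplified stand-ins for Spark's internal `ConfigEntry` and `SparkConf` (they are not the real Spark classes), kept just small enough to show why the `ConfigEntry`-keyed setter is typesafe while the `String`-keyed one is not:

   ```scala
   // Hypothetical, simplified stand-in for Spark's ConfigEntry:
   // a typed key carrying the value type T in its type parameter.
   final case class ConfigEntry[T](key: String)

   // Hypothetical, simplified stand-in for SparkConf with both set overloads.
   class Conf {
     private val settings = scala.collection.mutable.Map.empty[String, String]

     // String-keyed setter: the value must already be a String (not typesafe).
     def set(key: String, value: String): this.type = {
       settings(key) = value
       this
     }

     // ConfigEntry-keyed setter: the compiler checks that the value
     // matches the entry's type parameter T.
     def set[T](entry: ConfigEntry[T], value: T): this.type = {
       settings(entry.key) = value.toString
       this
     }

     def get(key: String): Option[String] = settings.get(key)
   }

   object Demo {
     // Mirrors DRIVER_PORT from the review: an Int-typed entry.
     val DRIVER_PORT = ConfigEntry[Int]("spark.driver.port")

     def main(args: Array[String]): Unit = {
       val conf = new Conf
       conf.set(DRIVER_PORT, 4040)        // compiles: Int is expected
       // conf.set(DRIVER_PORT, "4040")   // would NOT compile: type mismatch
       conf.set("spark.driver.host", "localhost") // String overload: no type check
       println(conf.get("spark.driver.port"))
     }
   }
   ```

   With the typed overload, passing `"4040"` for an `Int`-typed entry is rejected at compile time, which is exactly the safety the review comment is asking the test code to take advantage of.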

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org