Posted to issues@spark.apache.org by "Yifan Guo (JIRA)" <ji...@apache.org> on 2019/02/26 07:25:00 UTC

[jira] [Created] (SPARK-26993) _minRegisteredRatio default value is zero not 0.8 for Yarn

Yifan Guo created SPARK-26993:
---------------------------------

             Summary: _minRegisteredRatio default value is zero not 0.8 for Yarn
                 Key: SPARK-26993
                 URL: https://issues.apache.org/jira/browse/SPARK-26993
             Project: Spark
          Issue Type: Question
          Components: YARN
    Affects Versions: 2.4.0
            Reporter: Yifan Guo


private[spark]
class CoarseGrainedSchedulerBackend(scheduler: TaskSchedulerImpl, val rpcEnv: RpcEnv)
  extends ExecutorAllocationClient with SchedulerBackend with Logging {

  // Use an atomic variable to track total number of cores in the cluster for simplicity and speed
  protected val totalCoreCount = new AtomicInteger(0)
  // Total number of executors that are currently registered
  protected val totalRegisteredExecutors = new AtomicInteger(0)
  protected val conf = scheduler.sc.conf
  private val maxRpcMessageSize = RpcUtils.maxMessageSizeBytes(conf)
  private val defaultAskTimeout = RpcUtils.askRpcTimeout(conf)
  // Submit tasks only after (registered resources / total expected resources)
  // is equal to at least this value, that is double between 0 and 1.
  private val _minRegisteredRatio =
    math.min(1, conf.getDouble("spark.scheduler.minRegisteredResourcesRatio", 0))

 

The YARN backend (YarnSchedulerBackend) overrides it:

override val minRegisteredRatio =
  if (conf.getOption("spark.scheduler.minRegisteredResourcesRatio").isEmpty) {
    0.8
  } else {
    super.minRegisteredRatio
  }
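
For context, the ratio gates when the backend reports that enough resources have registered to start scheduling; the check is along the lines of registered executors >= expected executors * minRegisteredRatio. Below is a tiny self-contained sketch of that arithmetic, using a hypothetical helper and made-up numbers (not the actual Spark code):

object RatioGateSketch {
  // Hypothetical helper mirroring the shape of the readiness check.
  def sufficientResources(registered: Int, expected: Int, ratio: Double): Boolean =
    registered >= expected * ratio

  def main(args: Array[String]): Unit = {
    println(sufficientResources(registered = 0, expected = 10, ratio = 0.0)) // true: ratio 0 never blocks
    println(sufficientResources(registered = 0, expected = 10, ratio = 0.8)) // false: waits for 8 of 10
    println(sufficientResources(registered = 8, expected = 10, ratio = 0.8)) // true
  }
}

So with a ratio of 0 the gate never blocks, while 0.8 waits until 80% of the expected executors have registered.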

 

Apparently, if "spark.scheduler.minRegisteredResourcesRatio" is not configured, the default value of _minRegisteredRatio is zero, not 0.8.

Is that on purpose?
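
For illustration, here is a minimal standalone sketch of how the two definitions above interact when the key is absent. The class names are hypothetical stand-ins and a plain Map replaces SparkConf; this is not the real Spark code:

// Hypothetical stand-ins for CoarseGrainedSchedulerBackend / YarnSchedulerBackend,
// only to show the fallback behaviour of minRegisteredRatio.
object MinRegisteredRatioSketch {

  class BaseBackendSketch(conf: Map[String, String]) {
    // Base-class behaviour: default to 0 when the key is missing.
    private val _minRegisteredRatio: Double =
      math.min(1, conf.get("spark.scheduler.minRegisteredResourcesRatio")
        .map(_.toDouble).getOrElse(0.0))

    def minRegisteredRatio: Double = _minRegisteredRatio
  }

  class YarnBackendSketch(conf: Map[String, String]) extends BaseBackendSketch(conf) {
    // YARN-style behaviour: fall back to 0.8 when the key is not set at all.
    override val minRegisteredRatio: Double =
      if (!conf.contains("spark.scheduler.minRegisteredResourcesRatio")) 0.8
      else super.minRegisteredRatio
  }

  def main(args: Array[String]): Unit = {
    val unset = Map.empty[String, String]
    println(new BaseBackendSketch(unset).minRegisteredRatio) // prints 0.0
    println(new YarnBackendSketch(unset).minRegisteredRatio) // prints 0.8
  }
}

Running it prints 0.0 for the base sketch and 0.8 for the YARN-style sketch, mirroring the two definitions quoted above.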

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org