Posted to issues@spark.apache.org by "zhuml (Jira)" <ji...@apache.org> on 2022/07/11 09:18:00 UTC

[jira] [Created] (SPARK-39742) Requesting executors after killing executors does not yield the expected number of executors

zhuml created SPARK-39742:
-----------------------------

             Summary: Requesting executors after killing executors does not yield the expected number of executors
                 Key: SPARK-39742
                 URL: https://issues.apache.org/jira/browse/SPARK-39742
             Project: Spark
          Issue Type: Bug
          Components: Scheduler
    Affects Versions: 3.2.1
            Reporter: zhuml


I used the killExecutors and requestExecutors functions of SparkContext to adjust resources dynamically, and found that calling requestExecutors after killExecutors does not produce the expected number of executors.
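For reference, a minimal sketch of that call pattern (assuming an existing SparkContext `sc` running against a standalone cluster; the executor ids are hypothetical):
{code:scala}
// Sketch only: sc is an existing SparkContext; executor ids are hypothetical.
// Both methods are developer APIs on SparkContext and return true if the
// request is acknowledged by the cluster manager.
val killed: Boolean = sc.killExecutors(Seq("0", "1", "2")) // kill 3 executors
val added: Boolean  = sc.requestExecutors(3)               // ask for 3 more
{code}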

The following unit test, added to StandaloneDynamicAllocationSuite.scala, reproduces the problem:
{code:scala}
test("kill executors first and then request") {
    sc = new SparkContext(appConf
      .set(config.EXECUTOR_CORES, 2)
      .set(config.CORES_MAX, 8))
    val appId = sc.applicationId
    eventually(timeout(10.seconds), interval(10.millis)) {
      val apps = getApplications()
      assert(apps.size === 1)
      assert(apps.head.id === appId)
      assert(apps.head.executors.size === 4) // 8 max cores / 2 cores per executor
      assert(apps.head.getExecutorLimit === Int.MaxValue)
    }
    // sync executors between the Master and the driver, needed because
    // the driver refuses to kill executors it does not know about
    syncExecutors(sc)
    val executors = getExecutorIds(sc)
    assert(executors.size === 4)
    // kill 3 of the 4 executors
    assert(sc.killExecutors(executors.take(3)))
    val apps = getApplications()
    assert(apps.head.executors.size === 1)
    // request 3 more executors
    assert(sc.requestExecutors(3))
    assert(apps.head.executors.size === 4)
  } {code}
The final assertion fails:
{noformat}
3 did not equal 4
Expected :4
Actual   :3
{noformat}
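The arithmetic the test expects (a sketch of the reporter's expectation, not of Spark's internal bookkeeping; all names here are illustrative):
{code:scala}
// A minimal model of the executor count this test expects; the names do not
// correspond to Spark internals.
object ExpectedCount {
  def main(args: Array[String]): Unit = {
    val maxCores      = 8                        // spark.cores.max
    val executorCores = 2                        // spark.executor.cores
    var executors     = maxCores / executorCores // 4 executors at startup
    executors -= 3                               // killExecutors on 3 ids => 1
    executors += 3                               // requestExecutors(3)   => 4
    println(s"expected: $executors, observed: 3") // expected: 4, observed: 3
  }
}
{code}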


