Posted to issues@spark.apache.org by "胡振宇 (Jira)" <ji...@apache.org> on 2021/08/25 12:19:00 UTC

[jira] [Created] (SPARK-36584) ExecutorMonitor#onBlockUpdated will receive events from the driver

胡振宇 created SPARK-36584:
---------------------------

             Summary: ExecutorMonitor#onBlockUpdated will receive events from the driver
                 Key: SPARK-36584
                 URL: https://issues.apache.org/jira/browse/SPARK-36584
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.1.2
         Environment: Spark 3.1.2
            Reporter: 胡振宇


When the driver broadcasts an object, it sends a [SparkListenerBlockUpdated](https://github.com/apache/spark/blob/df0ec56723f0b47c3629055fa7a8c63bb4285147/core/src/main/scala/org/apache/spark/scheduler/SparkListener.scala#L228) event. [ExecutorMonitor#onBlockUpdated](https://github.com/apache/spark/blob/df0ec56723f0b47c3629055fa7a8c63bb4285147/core/src/main/scala/org/apache/spark/scheduler/dynalloc/ExecutorMonitor.scala#L380) receives and handles this event; inside that method it calls [ensureExecutorIsTracked](https://github.com/apache/spark/blob/df0ec56723f0b47c3629055fa7a8c63bb4285147/core/src/main/scala/org/apache/spark/scheduler/dynalloc/ExecutorMonitor.scala#L489), which puts the driver into the `executors` map. In my understanding, `ExecutorMonitor` should only monitor executors, not the driver. Moreover, adding the driver to `executors` distorts the calculation in [ExecutorAllocationManager#removeExecutors](https://github.com/apache/spark/blob/df0ec56723f0b47c3629055fa7a8c63bb4285147/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala#L552), because the driver is counted as one of the `executors`.
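
One conceivable fix (a sketch only, not a confirmed patch) would be to ignore block updates that originate from the driver's own BlockManager before any tracking happens. The sketch below assumes the `BlockManagerId.isDriver` helper in Spark core can identify the driver's block manager:

```scala
// Sketch of a guard at the top of ExecutorMonitor#onBlockUpdated.
// Assumption: BlockManagerId.isDriver identifies the driver's BlockManager.
override def onBlockUpdated(event: SparkListenerBlockUpdated): Unit = {
  val bmId = event.blockUpdatedInfo.blockManagerId
  if (bmId.isDriver) {
    // Broadcast (and other driver-local) blocks originate from the driver;
    // returning here keeps ensureExecutorIsTracked from registering the
    // driver in the `executors` map.
    return
  }
  // ... existing handling for executor block updates continues here ...
}
```

With such a guard, broadcast-related block updates would never reach `ensureExecutorIsTracked`, so the driver would no longer inflate the executor count used by dynamic allocation.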
