Posted to commits@spark.apache.org by do...@apache.org on 2020/09/18 21:10:30 UTC

[spark] branch branch-3.0 updated: [SPARK-32898][CORE] Fix wrong executorRunTime when task killed before real start

This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
     new 03fb144  [SPARK-32898][CORE] Fix wrong executorRunTime when task killed before real start
03fb144 is described below

commit 03fb144d6a09829e39fa8ae39b7522fa2e00b248
Author: yi.wu <yi...@databricks.com>
AuthorDate: Fri Sep 18 14:02:14 2020 -0700

    [SPARK-32898][CORE] Fix wrong executorRunTime when task killed before real start
    
    ### What changes were proposed in this pull request?
    
    Calculate the executorRunTime only when taskStartTimeNs > 0; otherwise, set executorRunTime to 0.
    
    ### Why are the changes needed?
    
    Bug fix.
    
    It's possible for a task to be killed (e.g., by another successful attempt) before it reaches "taskStartTimeNs = System.nanoTime()". In this case, taskStartTimeNs is still 0 because it was never actually initialized, so calculating System.nanoTime() - taskStartTimeNs produces a wrong executorRunTime.
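
    To make the failure mode concrete, here is a minimal, self-contained Scala sketch (illustration only; apart from taskStartTimeNs, the object and variable names are hypothetical and not part of Executor.scala). When the task is killed before taskStartTimeNs is assigned, the field keeps its default of 0, and since System.nanoTime() is measured from an arbitrary origin, the subtraction yields a huge, meaningless duration:

        import java.util.concurrent.TimeUnit

        object KilledBeforeStartSketch {
          def main(args: Array[String]): Unit = {
            // The task was killed before reaching "taskStartTimeNs = System.nanoTime()",
            // so the field still holds its default value of 0.
            val taskStartTimeNs = 0L

            // Pre-fix computation: subtracting 0 just converts the raw nanoTime reading
            // into milliseconds, producing a bogus, typically enormous "run time".
            val bogusRunTimeMs = TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - taskStartTimeNs)
            println(s"executorRunTime without the fix: $bogusRunTimeMs ms")
          }
        }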
    
    ### Does this PR introduce _any_ user-facing change?
    
    Yes, users will see the correct executorRunTime.
    
    ### How was this patch tested?
    
    Pass existing tests.
    
    Closes #29789 from Ngone51/fix-SPARK-32898.
    
    Authored-by: yi.wu <yi...@databricks.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
    (cherry picked from commit f1dc479d39a6f05df7155008d8ec26dff42bb06c)
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 core/src/main/scala/org/apache/spark/executor/Executor.scala | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/core/src/main/scala/org/apache/spark/executor/Executor.scala b/core/src/main/scala/org/apache/spark/executor/Executor.scala
index e9f1d9c..83e4a73 100644
--- a/core/src/main/scala/org/apache/spark/executor/Executor.scala
+++ b/core/src/main/scala/org/apache/spark/executor/Executor.scala
@@ -367,7 +367,9 @@ private[spark] class Executor(
       // Report executor runtime and JVM gc time
       Option(task).foreach(t => {
         t.metrics.setExecutorRunTime(TimeUnit.NANOSECONDS.toMillis(
-          System.nanoTime() - taskStartTimeNs))
+          // SPARK-32898: it's possible that a task is killed while taskStartTimeNs still has its
+          // initial value (= 0). In this case, the executorRunTime should be reported as 0.
+          if (taskStartTimeNs > 0) System.nanoTime() - taskStartTimeNs else 0))
         t.metrics.setJvmGCTime(computeTotalGcTime() - startGCTime)
       })
 
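For a quick sanity check outside of Spark, the guarded expression from the patch can be exercised in a standalone Scala snippet (the object and method names below are made up for illustration; this is a sketch, not Spark code):

    import java.util.concurrent.TimeUnit

    object GuardedRunTimeCheck {
      // Mirrors the patched expression: treat the default value 0 as "task never really started".
      def runTimeMs(taskStartTimeNs: Long): Long =
        TimeUnit.NANOSECONDS.toMillis(
          if (taskStartTimeNs > 0) System.nanoTime() - taskStartTimeNs else 0)

      def main(args: Array[String]): Unit = {
        assert(runTimeMs(0L) == 0L)  // killed before start: reported as 0 ms
        val startedNs = System.nanoTime()
        Thread.sleep(5)
        println(s"normally started task: ${runTimeMs(startedNs)} ms")  // small positive elapsed time
      }
    }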


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org