Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/08/03 07:50:00 UTC

[jira] [Commented] (SPARK-39967) Instead of using the scalar tasksSuccessful, use the successful array to calculate whether the task is completed

    [ https://issues.apache.org/jira/browse/SPARK-39967?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17574567#comment-17574567 ] 

Apache Spark commented on SPARK-39967:
--------------------------------------

User 'smallzhongfeng' has created a pull request for this issue:
https://github.com/apache/spark/pull/37395

> Instead of using the scalar tasksSuccessful, use the successful array to calculate whether the task is completed
> ----------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-39967
>                 URL: https://issues.apache.org/jira/browse/SPARK-39967
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.4.3, 2.4.6
>            Reporter: jingxiong zhong
>            Priority: Critical
>
> When counting the number of successful tasks in a Spark stage, Spark relies on the scalar counter `tasksSuccessful`, but the actual success or failure of each task is recorded in the `successful` array. Logs show that the count tracked by `tasksSuccessful` can become inconsistent with the state stored in the `successful` array. The `successful` array should therefore be treated as the source of truth.
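The inconsistency described above is an instance of a general pattern: a separately maintained counter can drift from the per-element flags it is supposed to summarize. The sketch below is a hypothetical, simplified illustration in Java (not Spark's actual `TaskSetManager` code; the class and method names here are invented) showing why deriving the count from the array removes the possibility of drift.

```java
// Hypothetical sketch, NOT Spark's actual TaskSetManager implementation.
// It illustrates the two counting strategies discussed in the issue:
// a scalar counter maintained alongside a boolean array, versus a count
// derived directly from the array.
public class TaskSetProgress {
    private final boolean[] successful;   // one flag per task index (source of truth)
    private int tasksSuccessful = 0;      // scalar counter, updated separately

    public TaskSetProgress(int numTasks) {
        this.successful = new boolean[numTasks];
    }

    // Drift-prone pattern: the counter is correct only if every code path
    // that touches the array also updates the counter, and vice versa.
    public void markSuccessful(int index) {
        if (!successful[index]) {
            successful[index] = true;
            tasksSuccessful++;
        }
    }

    public int scalarCount() {
        return tasksSuccessful;
    }

    // Direction proposed in the issue: count successes from the array itself,
    // so the result can never disagree with the recorded per-task state.
    public int successfulFromArray() {
        int count = 0;
        for (boolean b : successful) {
            if (b) count++;
        }
        return count;
    }

    public boolean isComplete() {
        return successfulFromArray() == successful.length;
    }
}
```

The trade-off is a linear scan per query instead of an O(1) counter read, which is typically negligible at the scale of a single task set and eliminates an entire class of bookkeeping bugs.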



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org