Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/09/28 20:56:00 UTC

[jira] [Commented] (SPARK-25568) Continue to update the remaining accumulators when failing to update one accumulator

    [ https://issues.apache.org/jira/browse/SPARK-25568?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16632547#comment-16632547 ] 

Apache Spark commented on SPARK-25568:
--------------------------------------

User 'zsxwing' has created a pull request for this issue:
https://github.com/apache/spark/pull/22586

> Continue to update the remaining accumulators when failing to update one accumulator
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-25568
>                 URL: https://issues.apache.org/jira/browse/SPARK-25568
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.2, 2.4.0
>            Reporter: Shixiong Zhu
>            Assignee: Shixiong Zhu
>            Priority: Major
>
> Currently, when DAGScheduler.updateAccumulators fails to update one accumulator, it skips all of the remaining accumulators. We should continue updating the remaining accumulators where possible, so that they can still report correct values.
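The fix described above amounts to catching the failure per accumulator instead of letting one exception abort the whole update loop. A minimal sketch of that pattern (this is not the actual DAGScheduler code; `Acc` and `updateAll` are hypothetical names for illustration):

```scala
// Sketch: update each accumulator independently so that one failure
// does not prevent the remaining accumulators from being updated.
object AccumulatorUpdateSketch {
  // Hypothetical accumulator whose update may throw.
  final case class Acc(id: Long, var value: Long, failOnUpdate: Boolean = false) {
    def add(delta: Long): Unit = {
      if (failOnUpdate) throw new IllegalStateException(s"accumulator $id failed")
      value += delta
    }
  }

  // A single try block around the whole loop would skip every accumulator
  // after the first failure; catching per accumulator keeps going and
  // returns the ids that could not be updated (so they can be logged).
  def updateAll(accums: Seq[Acc], delta: Long): Seq[Long] = {
    val failed = scala.collection.mutable.ArrayBuffer.empty[Long]
    accums.foreach { acc =>
      try acc.add(delta)
      catch { case _: Exception => failed += acc.id } // record and continue
    }
    failed.toSeq
  }

  def main(args: Array[String]): Unit = {
    val accums = Seq(Acc(1, 0), Acc(2, 0, failOnUpdate = true), Acc(3, 0))
    val failedIds = updateAll(accums, 5)
    println(failedIds.mkString(","))           // id of the failing accumulator
    println(accums.map(_.value).mkString(",")) // the others were still updated
  }
}
```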



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org