Posted to reviews@spark.apache.org by advancedxy <gi...@git.apache.org> on 2018/05/01 13:54:52 UTC

[GitHub] spark pull request #21165: [Spark-20087][CORE] Attach accumulators / metrics...

Github user advancedxy commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21165#discussion_r185226136
  
    --- Diff: core/src/main/scala/org/apache/spark/TaskEndReason.scala ---
    @@ -212,9 +212,15 @@ case object TaskResultLost extends TaskFailedReason {
      * Task was killed intentionally and needs to be rescheduled.
      */
     @DeveloperApi
    -case class TaskKilled(reason: String) extends TaskFailedReason {
    +case class TaskKilled(
    +    reason: String,
    +    accumUpdates: Seq[AccumulableInfo] = Seq.empty,
    +    private[spark] val accums: Seq[AccumulatorV2[_, _]] = Nil)
    --- End diff --
    
    @cloud-fan After a second look, I don't think we can clean up `ExceptionFailure` unless we are willing to break `JsonProtocol` compatibility.
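
    To make the `JsonProtocol` concern concrete: the JSON shape of each
    `TaskEndReason` is part of the event-log format, so a field like
    `ExceptionFailure.accumUpdates` can't simply be removed without breaking
    the ability to read existing logs, and any new field on `TaskKilled` has
    to be read with a fallback. Below is a minimal json4s sketch of that
    read-side fallback; the field name "Accumulator Updates" follows the
    style of the existing `ExceptionFailure` serialization, but the helper
    itself is an illustration, not the actual patch:

        import org.json4s._

        // Illustration only, not the actual patch: read an optional
        // "Accumulator Updates" array so that event logs written before
        // the field existed still parse.
        def accumUpdatesJson(json: JValue): Seq[JValue] =
          json \ "Accumulator Updates" match {
            case JArray(updates) => updates   // new logs: field present
            case _               => Seq.empty // old logs: field absent
          }

    Giving the new `TaskKilled` parameters default values, as the diff above
    does, keeps the Scala API source-compatible in the same additive spirit.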


---
