Posted to reviews@spark.apache.org by cloud-fan <gi...@git.apache.org> on 2018/04/28 03:39:10 UTC
[GitHub] spark pull request #21165: [Spark-20087][CORE] Attach accumulators / metrics...
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/21165#discussion_r184838512
--- Diff: core/src/main/scala/org/apache/spark/TaskEndReason.scala ---
@@ -212,9 +212,15 @@ case object TaskResultLost extends TaskFailedReason {
* Task was killed intentionally and needs to be rescheduled.
*/
@DeveloperApi
-case class TaskKilled(reason: String) extends TaskFailedReason {
+case class TaskKilled(
+ reason: String,
+ accumUpdates: Seq[AccumulableInfo] = Seq.empty,
+ private[spark] val accums: Seq[AccumulatorV2[_, _]] = Nil)
--- End diff --
Previously we used `AccumulableInfo` to expose accumulator information to end users. Now that `AccumulatorV2` is a public class, we don't need that wrapper anymore. I think we can just do
```
case class TaskKilled(reason: String, accums: Seq[AccumulatorV2[_, _]])
```
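To illustrate the suggestion, here is a minimal hedged sketch (using a hypothetical `TaskKilledSketch` case class, not the real `TaskKilled`) of how callers could read accumulator values directly off `AccumulatorV2` without going through `AccumulableInfo`:
```
// Sketch only: a stand-in case class mirroring the proposed signature.
import org.apache.spark.util.{AccumulatorV2, LongAccumulator}

case class TaskKilledSketch(reason: String, accums: Seq[AccumulatorV2[_, _]])

val recordsRead = new LongAccumulator  // a concrete AccumulatorV2 subclass
recordsRead.add(42L)

val tk = TaskKilledSketch("preempted", Seq(recordsRead))
// Values are readable directly, no AccumulableInfo conversion needed.
tk.accums.foreach(acc => println(acc.value))
```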
---
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org