Posted to reviews@spark.apache.org by squito <gi...@git.apache.org> on 2018/02/02 22:04:16 UTC
[GitHub] spark pull request #17422: [SPARK-20087] Attach accumulators / metrics to 'T...
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/17422#discussion_r165772055
--- Diff: core/src/main/scala/org/apache/spark/TaskEndReason.scala ---
@@ -212,9 +212,19 @@ case object TaskResultLost extends TaskFailedReason {
* Task was killed intentionally and needs to be rescheduled.
*/
@DeveloperApi
-case class TaskKilled(reason: String) extends TaskFailedReason {
- override def toErrorString: String = s"TaskKilled ($reason)"
+case class TaskKilled(
+ reason: String,
+ accumUpdates: Seq[AccumulableInfo] = Seq.empty,
+ private[spark] var accums: Seq[AccumulatorV2[_, _]] = Nil)
+ extends TaskFailedReason {
+
+  override def toErrorString: String = s"TaskKilled ($reason)"
override def countTowardsTaskFailures: Boolean = false
+
+ private[spark] def withAccums(accums: Seq[AccumulatorV2[_, _]]): TaskKilled = {
--- End diff --
I don't think this method is really necessary at all; you could just pass the accumulators in the constructor at the places it's used.
---
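To illustrate the suggestion above, here is a minimal sketch of what the call sites might look like without `withAccums`, passing the accumulators directly through the constructor instead. The `AccumulableInfo` and `AccumulatorV2` stand-ins below are simplified stubs, not Spark's real classes, and the call-site values are hypothetical:

```scala
// Simplified stand-ins for Spark's types (assumptions, not the real API).
case class AccumulableInfo(id: Long, name: Option[String])
abstract class AccumulatorV2[IN, OUT]

// TaskKilled as in the diff, but without the withAccums helper:
// callers supply accums via the constructor directly.
case class TaskKilled(
    reason: String,
    accumUpdates: Seq[AccumulableInfo] = Seq.empty,
    accums: Seq[AccumulatorV2[_, _]] = Nil) {
  def toErrorString: String = s"TaskKilled ($reason)"
  def countTowardsTaskFailures: Boolean = false
}

object Demo extends App {
  // Instead of TaskKilled(reason).withAccums(taskAccums), construct
  // the value with the accumulators in one step at the call site:
  val taskAccums: Seq[AccumulatorV2[_, _]] = Nil  // hypothetical value
  val killed = TaskKilled("preempted", Seq.empty, taskAccums)
  println(killed.toErrorString)  // TaskKilled (preempted)
}
```

This keeps `TaskKilled` a plain immutable case class, avoiding the mutable `var accums` field that the helper-method approach requires.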