Posted to reviews@spark.apache.org by sadhen <gi...@git.apache.org> on 2018/04/26 10:26:35 UTC
[GitHub] spark issue #11205: [SPARK-11334][Core] Handle maximum task failure situatio...
Github user sadhen commented on the issue:
https://github.com/apache/spark/pull/11205
@jerryshao I think the second bullet has not been fixed in SPARK-13054.
I am using Spark 2.1.1, and I still find that finished task IDs remain in `private val executorIdToTaskIds = new mutable.HashMap[String, mutable.HashSet[Long]]`.
But `numRunningTasks` nevertheless equals 0, because of:
```
if (numRunningTasks != 0) {
  logWarning("No stages are running, but numRunningTasks != 0")
  numRunningTasks = 0
}
```
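
To make the bookkeeping issue concrete, here is a minimal sketch (not Spark's actual `ExecutorAllocationManager`; the object and method names are hypothetical) of the executor-to-task-IDs map in question. If the task-end cleanup is never invoked for a task (e.g. the event is dropped at max task failures), its ID lingers in the map even though the task has finished:

```
import scala.collection.mutable

// Hypothetical, simplified sketch of the executorIdToTaskIds bookkeeping.
object TaskBookkeepingSketch {
  val executorIdToTaskIds = new mutable.HashMap[String, mutable.HashSet[Long]]

  // Record a task as running on an executor.
  def onTaskStart(executorId: String, taskId: Long): Unit =
    executorIdToTaskIds.getOrElseUpdate(executorId, new mutable.HashSet[Long]) += taskId

  // Remove a finished task; drop the executor entry once it has no tasks left.
  def onTaskEnd(executorId: String, taskId: Long): Unit =
    executorIdToTaskIds.get(executorId).foreach { ids =>
      ids -= taskId
      if (ids.isEmpty) executorIdToTaskIds -= executorId
    }

  def main(args: Array[String]): Unit = {
    onTaskStart("exec-1", 1L)
    onTaskStart("exec-1", 2L)
    onTaskEnd("exec-1", 1L)
    // Task 2 has finished, but no onTaskEnd event arrived for it,
    // so its ID is still tracked as if it were running:
    println(executorIdToTaskIds("exec-1"))
  }
}
```

The leak is silent precisely because the `numRunningTasks != 0` guard quoted above resets the counter to 0 when no stages are running, while the stale entries in the map are left untouched.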
---