Posted to issues@spark.apache.org by "吴志龙 (JIRA)" <ji...@apache.org> on 2018/02/07 02:41:00 UTC
[jira] [Created] (SPARK-23346) Failed tasks reported as success if the failure reason is not ExceptionFailure
吴志龙 created SPARK-23346:
---------------------------
Summary: Failed tasks reported as success if the failure reason is not ExceptionFailure
Key: SPARK-23346
URL: https://issues.apache.org/jira/browse/SPARK-23346
Project: Spark
Issue Type: Bug
Components: Spark Core, SQL
Affects Versions: 2.2.0
Environment: HADOOP 2.6 + JDK1.8 + SPARK 2.2.0
Reporter: 吴志龙
There are many other failure reasons besides ExceptionFailure, such as TaskResultLost, yet the task status is still reported as success. In the web UI, non-ExceptionFailure failures are counted as successful tasks, which is highly misleading.
Detail message:
Job aborted due to stage failure: Task 0 in stage 7.0 failed 10 times, most recent failure: Lost task 0.9 in stage 7.0 (TID 35, 60.hadoop.com, executor 27): TaskResultLost (result lost from block manager)
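The reported behavior can be illustrated with a minimal sketch. This is plain Java, not Spark's actual Scala internals; the nested classes (TaskEndReason, Success, ExceptionFailure, TaskResultLost) are simplified stand-ins named after Spark's task-end-reason hierarchy, and both classification methods are hypothetical, written only to contrast the misleading logic described above with a stricter one:

```java
// Hedged sketch: models how a UI-side listener might classify task end
// reasons. Class names mirror Spark's TaskEndReason hierarchy but are
// simplified stand-ins, not Spark source code.
public class TaskStatusSketch {
    interface TaskEndReason {}
    static final class Success implements TaskEndReason {}
    static final class ExceptionFailure implements TaskEndReason {}
    static final class TaskResultLost implements TaskEndReason {}

    // Classification matching the reported behavior: anything that is not
    // an ExceptionFailure is counted as a success, so a TaskResultLost
    // task shows up as successful in the web UI.
    static boolean buggyIsSuccess(TaskEndReason reason) {
        return !(reason instanceof ExceptionFailure);
    }

    // Stricter classification: only an actual Success counts as success;
    // every other end reason is treated as a failure.
    static boolean fixedIsSuccess(TaskEndReason reason) {
        return reason instanceof Success;
    }

    public static void main(String[] args) {
        TaskEndReason lost = new TaskResultLost();
        System.out.println(buggyIsSuccess(lost)); // true  -> misreported as success
        System.out.println(fixedIsSuccess(lost)); // false -> correctly a failure
    }
}
```

With the stricter check, a TaskResultLost task would be counted among failed tasks rather than inflating the success count.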
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org