Posted to user@spark.apache.org by Aureliano Buendia <bu...@gmail.com> on 2014/01/14 02:38:49 UTC

Occasional failed tasks

Hi,

While running a big Spark job, I can see a tiny fraction of failed tasks in
the Spark web UI:

26630/536568 (15 failed)

Since all the tasks run the same code, the failures cannot be caused by an
application error. The Spark log does not show any errors either.

- Does Spark retry these tasks?
- Are these failures due to something like hardware faults? Is this rate of
failed tasks normal, or should it be exactly zero?

Re: Occasional failed tasks

Posted by Peng Cheng <pc...@uow.edu.au>.
I think these failed tasks must have been retried automatically if you can't
see any errors in your results. Otherwise the entire application would throw
a SparkException and abort.

Unfortunately I don't know how to achieve this; my application always aborts.
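
If it helps, the number of times a single task may fail before Spark gives up
on the whole job is controlled by the spark.task.maxFailures setting, which
defaults to 4. Below is a minimal sketch of raising it, assuming a
SparkConf-based setup (Spark 0.9+); the application name is a placeholder:

    import org.apache.spark.{SparkConf, SparkContext}

    // Raise the per-task failure limit before creating the SparkContext.
    // spark.task.maxFailures is the number of times any one task may fail
    // before the job as a whole is aborted (the default is 4).
    val conf = new SparkConf()
      .setAppName("retry-demo")            // placeholder app name
      .set("spark.task.maxFailures", "8")  // allow up to 8 attempts per task
    val sc = new SparkContext(conf)

With the limit raised, a transient failure (a lost executor, a flaky node)
gets rescheduled elsewhere instead of aborting the job, which is why a small
failed-task count in the web UI can coexist with a successful run.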


