Posted to user@spark.apache.org by Renxia Wang <re...@gmail.com> on 2015/10/02 00:42:03 UTC
How to Set Retry Policy in Spark
Hi guys,
I know there is a way to set the number of retries for failed tasks, using
spark.task.maxFailures. What is the default retry policy for failed tasks?
Is it exponential backoff? My tasks sometimes fail because of socket
connection timeouts/resets, and even with retries, some tasks fail more
than spark.task.maxFailures times.
Thanks,
Zhique
Re: How to Set Retry Policy in Spark
Posted by Renxia Wang <re...@gmail.com>.
Additional Info: I am running Spark on YARN.
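On YARN, application-level retries are configured separately from task retries. A hedged sketch, again with placeholder class and jar names, assuming the standard spark.yarn.maxAppAttempts property:

```shell
# Illustrative only: on YARN, this caps how many times the
# ApplicationMaster itself is retried, independent of
# spark.task.maxFailures, which governs per-task retries.
spark-submit \
  --master yarn \
  --conf spark.yarn.maxAppAttempts=2 \
  --conf spark.task.maxFailures=8 \
  --class com.example.MyApp \
  myapp.jar
```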