Posted to user@spark.apache.org by Debasish Das <de...@gmail.com> on 2014/01/02 22:38:15 UTC
Standalone spark cluster dead nodes
Hi,
I have been running a standalone Spark cluster, but sometimes I see dead
nodes. The physical machines are not dead, but the worker JVM dies.
Is there a way to automatically restart the worker JVM when it dies?
Thanks.
Deb
Re: Standalone spark cluster dead nodes
Posted by Andrew Ash <an...@andrewash.com>.
Do you have stack traces or other errors from the workers' deaths?
This is a great case for the Tanuki Service Wrapper, which can be
configured to automatically restart JVMs that die.
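As a rough illustration of the approach, a minimal Tanuki Java Service Wrapper `wrapper.conf` for a Spark standalone worker might look like the sketch below. The classpath entries, master URL, and Spark version in the jar name are placeholders for this example, not values from the thread; the restart-related keys (`wrapper.on_exit.default`, `wrapper.max_failed_invocations`) are standard wrapper settings.

```
# Launch the worker through the wrapper's simple-app entry point
wrapper.java.command=java
wrapper.java.mainclass=org.tanukisoftware.wrapper.WrapperSimpleApp

# Classpath: wrapper jar plus the Spark assembly (paths are illustrative)
wrapper.java.classpath.1=lib/wrapper.jar
wrapper.java.classpath.2=/opt/spark/assembly/target/spark-assembly.jar

# Application: the standalone Worker, pointed at an example master URL
wrapper.app.parameter.1=org.apache.spark.deploy.worker.Worker
wrapper.app.parameter.2=spark://master-host:7077

# Restart the JVM on any exit, but give up after repeated rapid failures
wrapper.on_exit.default=RESTART
wrapper.max_failed_invocations=5
```

With a configuration along these lines, the wrapper process supervises the worker JVM and relaunches it whenever it exits, which addresses the dead-node symptom without intervening on the physical machine.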
Andrew
On Thu, Jan 2, 2014 at 4:38 PM, Debasish Das <de...@gmail.com> wrote:
> Hi,
>
> I have been running a standalone Spark cluster, but sometimes I see dead
> nodes. The physical machines are not dead, but the worker JVM dies.
>
> Is there a way to automatically restart the worker JVM when it dies?
>
> Thanks.
> Deb
>
>