Posted to user@spark.apache.org by Samy Dindane <sa...@dindane.com> on 2016/10/12 16:36:26 UTC

How to prevent having more than one instance of a specific job running on the cluster

Hi,

I'd like a specific job to fail if there's another instance of it already running on the cluster (Spark Standalone in my case).
How can I achieve this?

Thank you.
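
[Editor's note: one common way to get this behaviour on Spark Standalone is to have the driver take an exclusive lock before doing any work and fail fast if the lock is already held. The sketch below uses an atomic create on a filesystem path visible to every driver; the object name, method name, and lock path are illustrative, not part of any Spark API. Another option, not shown here, is to poll the standalone master's JSON status page for an already-running application with the same name before submitting.]

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileAlreadyExistsException, FileSystem, Path}

    object SingleInstanceLock {
      // Illustrative lock location; any filesystem visible to every driver works
      private val lockPath = new Path("hdfs:///locks/my-spark-job.lock")

      /** Fail fast if another instance of this job already holds the lock. */
      def acquireOrFail(conf: Configuration = new Configuration()): Unit = {
        val fs = FileSystem.get(lockPath.toUri, conf)
        try {
          // create(path, overwrite = false) fails atomically if the file exists;
          // some FileSystem implementations may throw a plain IOException instead
          fs.create(lockPath, false).close()
          // best-effort cleanup when the driver exits normally
          sys.addShutdownHook(fs.delete(lockPath, false))
        } catch {
          case _: FileAlreadyExistsException =>
            sys.error(s"Another instance of this job appears to be running (lock: $lockPath)")
        }
      }
    }

The driver would call SingleInstanceLock.acquireOrFail() before creating the SparkSession. Note that if a previous driver crashed without running its shutdown hook, the stale lock file has to be removed by hand (or aged out) before the job can start again.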

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org