Posted to user@spark.apache.org by Evgenii Morozov <ev...@gmail.com> on 2016/10/07 16:12:28 UTC

Is there a way to pause a Spark job

Hi!

We’re training a few RandomForest models and we wonder whether there is a way to pause one particular model’s training (we use our own web service and train the models from different application threads). I could pause manually in the middle of my own RDD processing (between actions) if that were required, but since control is inside the RandomForest class, I don’t think that’s possible. Or maybe there is a way?
Could you please shed some light on this?
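
To illustrate what I mean by pausing manually between actions, here is a minimal sketch of our own driver-side code (the TrainingControl/pauseGate helper is something we would write ourselves, not a Spark API, and the pipeline is simplified):

import java.util.concurrent.Semaphore

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Hypothetical helper: a gate our web service could toggle to pause the
// pipeline between actions. Not a Spark API, just plain JVM concurrency.
object TrainingControl {
  private val pauseGate = new Semaphore(1)

  def pause(): Unit  = pauseGate.acquire()   // called by the web service
  def resume(): Unit = pauseGate.release()

  // Blocks here if pause() has been called, otherwise passes straight through.
  def checkpointBetweenActions(): Unit = {
    pauseGate.acquire()
    pauseGate.release()
  }
}

def runPipeline(sc: SparkContext, data: RDD[Double]): Unit = {
  val counted = data.count()                  // first action
  TrainingControl.checkpointBetweenActions()  // safe place to pause: driver side, between actions
  val summed = data.sum()                     // next action runs only after resume()
  println(s"count=$counted sum=$summed")
}

This works when we own the driver-side loop, but RandomForest runs its iterations internally, so we have no such hook there.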

Another question: is it valid to kill the running application thread in order to stop the Spark job and restart it later?
Would that be risky from a stability point of view?
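
To make the second question concrete, here is roughly how a model is submitted from an application thread in our setup (the trainInBackground wrapper and the parameter values are illustrative, not our real code; RandomForest.trainClassifier is the standard MLlib call):

import org.apache.spark.SparkContext
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.rdd.RDD

// Each model is trained on its own application thread in the driver JVM.
def trainInBackground(sc: SparkContext, data: RDD[LabeledPoint]): Thread = {
  val worker = new Thread(new Runnable {
    override def run(): Unit = {
      val model = RandomForest.trainClassifier(
        data,
        numClasses = 2,
        categoricalFeaturesInfo = Map.empty[Int, Int],
        numTrees = 100,
        featureSubsetStrategy = "auto",
        impurity = "gini",
        maxDepth = 5,
        maxBins = 32,
        seed = 42)
      // persist / publish the model here
    }
  })
  worker.start()
  worker
}

// "Killing" the job would mean interrupting this thread from the web service,
// e.g. worker.interrupt(), and re-submitting the training later.

What I’d like to know is whether stopping a job this way can leave the SparkContext or the cluster in a bad state.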

Apache Spark 1.6.1 is in use.

Thank you very much in advance.