Posted to user@spark.apache.org by M Singh <ma...@yahoo.com.INVALID> on 2018/03/23 16:19:01 UTC

Apache Spark Structured Streaming - How to keep executors alive.

Hi:
I am working with Spark Structured Streaming (2.2.1) and Kafka, and I want 100 executors to stay alive. I set spark.executor.instances to 100. The process starts running with 100 executors, but after some time only a few remain, which causes a backlog of events from Kafka.
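
For reference, a rough sketch of my setup (the app name, broker address, and topic below are placeholders, not my actual values):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("kafka-structured-streaming")          // placeholder name
      .config("spark.executor.instances", "100")      // request 100 executors up front
      .getOrCreate()

    // Structured Streaming source; requires the spark-sql-kafka-0-10 package.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")  // placeholder broker
      .option("subscribe", "events")                      // placeholder topic
      .load()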
I thought I saw a setting that keeps executors from being killed, but I am not able to find that configuration in the Spark docs. If anyone knows the setting, please let me know.
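
The keys I have been looking at are the dynamic allocation ones, which release executors after they sit idle; whether one of these is the setting I am recalling is only a guess on my part:

    import org.apache.spark.sql.SparkSession

    // Sketch only: pin the executor count by stopping scale-down.
    val spark = SparkSession.builder()
      .appName("pin-executor-count")                       // placeholder name
      // Option A: turn dynamic allocation off so the 100 requested executors stay.
      .config("spark.dynamicAllocation.enabled", "false")
      // Option B (alternative): keep dynamic allocation on but set a floor, e.g.
      //   .config("spark.dynamicAllocation.minExecutors", "100")
      .getOrCreate()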
Thanks