Posted to user@spark.apache.org by Jeff Puro <jp...@mustwin.com> on 2016/09/23 22:21:15 UTC

Running Spark master/slave instances in non-daemon mode

Hi,

I recently tried deploying Spark master and slave instances to container-based
environments such as Docker, Nomad, etc. I've found two issues with how the
startup scripts work. The sbin/start-master.sh and sbin/start-slave.sh scripts
start a daemon by default, but this isn't as compatible with container
deployments as one would think. The first issue is that the daemon runs in the
background, while some container solutions require the application to run in
the foreground; otherwise they consider it to have stopped and may shut down
the task. The second issue is that the daemon's logs don't seem to get
integrated with the logging mechanism of the container solution. What is the
possibility of adding a flag or startup script that runs Spark in the
foreground? It would be great if a flag like SPARK_NO_DAEMONIZE could be
added, or another script provided for foreground execution.
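
For what it's worth, the workaround that seems to work for now is launching
the master and worker classes directly through bin/spark-class, which keeps
each process in the foreground and sends its logs to the console rather than
the daemon log files. A rough sketch (master-host and the ports below are
just placeholders/defaults):

    # Run the master in the foreground (default ports assumed)
    ./bin/spark-class org.apache.spark.deploy.master.Master \
        --host master-host --port 7077 --webui-port 8080

    # Run a worker in the foreground, pointing at that master
    ./bin/spark-class org.apache.spark.deploy.worker.Worker \
        spark://master-host:7077

This satisfies the foreground-process expectation of Docker and Nomad, but a
supported flag in the start scripts would be a much cleaner solution.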

Regards,

Jeff