Posted to user@spark.apache.org by Christopher Piggott <cp...@gmail.com> on 2018/05/02 15:57:37 UTC

Running apps over a VPN

My setup is a Spark master (using Spark's standalone scheduler) with 32
workers registered to it, all on a private network.  I can connect to
that private network via OpenVPN.

I would like to run Spark applications from IntelliJ on my local desktop,
but have them use the remote master and workers.

I thought this would allow that:

    sparkConf.set("spark.submit.deployMode", "cluster")
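
For context, here is a minimal sketch of the kind of driver-side setup I
mean (the master URL and app name below are placeholders for my real
values):

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    val sparkConf = new SparkConf()
      .setAppName("vpn-test")                              // placeholder name
      .setMaster("spark://master.internal.example:7077")   // placeholder master URL
      .set("spark.submit.deployMode", "cluster")

    val spark = SparkSession.builder().config(sparkConf).getOrCreate()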

But when my job runs, it still complains that there are not enough
resources/workers.  When I connect to the master, it shows that workers
have been assigned and are in the RUNNING state, yet my local Spark app
doesn't agree.  It's as if the workers were assigned but the desktop end
doesn't know about it.
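
In case it is relevant, my understanding is that the workers have to be
able to reach back to the driver running on my desktop, so I suspect
settings along these lines are involved (the VPN address below is
hypothetical, and I have not confirmed this is the actual problem):

    // How executors reach the driver, as far as I understand it;
    // 10.8.0.6 stands in for my desktop's OpenVPN address.
    sparkConf
      .set("spark.driver.host", "10.8.0.6")        // address reachable from the workers
      .set("spark.driver.bindAddress", "0.0.0.0")  // listen on all local interfaces
      .set("spark.driver.port", "40000")           // fixed port for firewall/VPN rules
      .set("spark.blockManager.port", "40001")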

I can use spark-submit.sh, but I was really hoping to be able to run
Spark applications directly from IDEA.  Is that possible?