Posted to user@spark.apache.org by Nikhil Goyal <no...@gmail.com> on 2019/10/03 04:19:48 UTC
Understanding deploy mode config
Hi all,
In a PySpark application, is the Python process itself the driver, or does
Spark start a separate driver process? If it is the same process as the
driver, then how does specifying "spark.submit.deployMode" as "cluster" in
the Spark conf come into play?
conf = (SparkConf()
        .setMaster("yarn")
        .set("spark.submit.deployMode", "cluster"))
sc = SparkContext(conf=conf)
Is the SparkContext created on the YARN application master, or on the
machine where this Python process is running?
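For comparison, this is the spark-submit invocation I would otherwise use to
request cluster mode (the script name and paths here are just placeholders):

```shell
# Hypothetical example: asking for cluster deploy mode on the
# spark-submit command line instead of inside SparkConf.
# "app.py" is a placeholder for the actual PySpark script.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  app.py
```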
Thanks
Nikhil