Posted to user@spark.apache.org by CondyZhou <zh...@126.com> on 2017/12/21 03:31:40 UTC

Keep SparkContext alive and wait for the next job, just like spark-shell

Hi, all.

I am confused about how to keep a SparkContext alive. Our situation is that users write a SQL query on a web page, and on the backend we initialize a SparkContext and then submit the Spark jobs. However, every time we run the query string, Spark requests resources from YARN again, and it is painful to waste so much time initializing the SparkContext.

So I am looking for a way to run the Spark job such that when the query finishes, the context is not stopped and the YARN resources are not released, just like spark-shell (I find that spark-shell keeps its resources once started).

Does anyone have an idea? Please give me a pointer. Thanks to all!
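
For illustration, a minimal sketch (not from this thread) of the pattern I mean: the backend builds one SparkSession when it starts, and every incoming query reuses that same session, so YARN resources are requested only once. The object name, app name, and the runQuery helper are just assumptions for the example:

    import org.apache.spark.sql.{Row, SparkSession}

    object QueryService {
      // Built once at backend startup; YARN resources are requested here,
      // not per query. App name and master are illustrative.
      lazy val spark: SparkSession = SparkSession.builder()
        .appName("long-lived-query-service")
        .master("yarn")
        .getOrCreate()

      // Every web request reuses the same session/context.
      def runQuery(sql: String): Array[Row] =
        spark.sql(sql).collect()

      def main(args: Array[String]): Unit = {
        println(runQuery("SELECT 1").mkString("\n"))
        // Keep the driver JVM alive (e.g. behind an HTTP server) and call
        // runQuery for each request; call spark.stop() only on shutdown.
      }
    }

The point is that the driver process stays up between queries, so the context (and its executors, unless dynamic allocation releases idle ones) is not torn down after each one.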



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/