Posted to user@spark.apache.org by Corey Nolet <cj...@gmail.com> on 2014/12/29 21:38:49 UTC

Submit spark jobs inside web application

I want to have a SparkContext inside of a web application running in Jetty
that I can use to submit jobs to a cluster of Spark executors. I am running
on YARN.
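
Concretely, the kind of thing I'm picturing is below. This is just a sketch;
the listener class name, app name, and config values are placeholders I made
up, and it assumes HADOOP_CONF_DIR is visible to the Jetty process:

import javax.servlet.{ServletContextEvent, ServletContextListener}
import org.apache.spark.{SparkConf, SparkContext}

// Create a SparkContext when the webapp is deployed and stop it (releasing
// the YARN containers) when the webapp is undeployed.
class SparkLifecycleListener extends ServletContextListener {
  @volatile private var sc: SparkContext = _

  override def contextInitialized(e: ServletContextEvent): Unit = {
    val conf = new SparkConf()
      .setAppName("webapp-spark")               // placeholder app name
      .setMaster("yarn-client")                 // driver runs inside the webapp JVM
      .set("spark.executor.instances", "4")     // made-up sizing
    sc = new SparkContext(conf)
    e.getServletContext.setAttribute("sparkContext", sc)
  }

  override def contextDestroyed(e: ServletContextEvent): Unit = {
    if (sc != null) sc.stop()                   // give the containers back to YARN
  }
}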

Ultimately, I would love it if I could just use something like
SparkSubmit.main() to allocate a bunch of resources in YARN when the webapp
is deployed and de-allocate them when the webapp goes down. The problem is
that SparkSubmit, just like the spark-submit script, requires that I have a
JAR that can be deployed to all the nodes right away. Perhaps I'm not
understanding enough about how Spark works internally, but I thought it was
similar to the Scala shell, where closures can be shipped off to executors
at runtime.
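
For reference, this is roughly the call I was imagining making from the
webapp's startup hook. The driver class and jar path are placeholders, and
the jar is exactly the part I'd like to avoid having to build and ship:

import org.apache.spark.deploy.SparkSubmit

// Programmatic equivalent of running the spark-submit script. It still
// wants a pre-built assembly jar on the command line.
object SubmitOnDeploy {
  def main(args: Array[String]): Unit = {
    SparkSubmit.main(Array(
      "--master", "yarn-cluster",
      "--class", "com.example.PlaceholderApp",   // placeholder driver class
      "--num-executors", "8",
      "/path/to/webapp-jobs-assembly.jar"        // placeholder jar path
    ))
  }
}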

I was wondering if another approach would be to start up an app in
yarn-client mode that does nothing, and then have my web application connect
to that master.
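
In other words, something like the placeholder app below, submitted once
just to hold on to the executors. How the web application would then attach
to it is the part I haven't figured out; the names here are made up:

import org.apache.spark.{SparkConf, SparkContext}

// A "do nothing" application: grab the YARN resources and sit on them.
object IdlePlaceholderApp {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("idle-placeholder"))
    Thread.sleep(Long.MaxValue)   // keep the application (and its executors) alive
    sc.stop()
  }
}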

Any ideas?


Thanks.