Posted to dev@spark.apache.org by Sadhan Sood <sa...@gmail.com> on 2014/10/22 21:18:26 UTC

Fwd: Sharing spark context across multiple spark sql cli initializations

We want to run multiple instances of the Spark SQL CLI on our YARN cluster,
with each instance used by a different user. Bringing up a separate CLI per
user looks suboptimal, because Spark on YARN keeps executor processes running
(and hence consuming resources) on worker nodes for the lifetime of each
application. So the right approach seems to be to share a single Spark
context across the multiple initializations and run just one Spark SQL
application. Is that understanding correct? Is there a way to do this
currently? It seems like it would need some kind of Thrift interface hooked
into the CLI driver.
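
For context, here is a rough sketch of the kind of setup I have in mind: one
long-running driver holds the shared context and exposes it over the Spark
SQL Thrift JDBC/ODBC server, so users connect to it instead of each starting
their own CLI. This is only a sketch, assuming a Spark build that includes
the hive-thriftserver module and the HiveThriftServer2.startWithContext entry
point; the object and app names are placeholders.

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.hive.HiveContext
  import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

  object SharedSqlServer {
    def main(args: Array[String]): Unit = {
      // One application, submitted once with --master yarn; its executors
      // are the only ones consuming cluster resources for SQL users.
      val conf = new SparkConf().setAppName("shared-spark-sql")
      val sc   = new SparkContext(conf)
      val hiveContext = new HiveContext(sc)

      // Expose the shared context over JDBC/ODBC; multiple users can then
      // run queries against the same SparkContext concurrently.
      HiveThriftServer2.startWithContext(hiveContext)
    }
  }

Each user would then connect with a JDBC client such as beeline (e.g.
beeline -u jdbc:hive2://<host>:10000) rather than launching a separate CLI
and a separate YARN application.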