Posted to user@livy.apache.org by Praveen Muthusamy <pr...@gmail.com> on 2019/02/21 08:47:01 UTC
Run spark application in different clusters
Hi,
Currently HADOOP_CONF_DIR and YARN_CONF_DIR are used to determine the YARN
cluster on which the application will run.
Can a single instance of the Livy server interface with multiple YARN/Hadoop
clusters? In other words, is it possible to spark-submit to two different
Hadoop clusters from a single Livy server?
Regards,
Praveen M
Re: Run spark application in different clusters
Posted by Meisam Fathi <me...@gmail.com>.
Currently this feature is not available in Livy.
The values of HADOOP_CONF_DIR and YARN_CONF_DIR are read once when Livy
starts and never change afterwards.
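A common workaround (not stated in this thread, just a deployment pattern) is to run one Livy server per Hadoop cluster, each started with its own configuration directories. The sketch below assumes hypothetical paths and that each instance's livy.conf sets a distinct livy.server.port; adjust to your deployment.

```shell
# Sketch: two Livy servers, one per Hadoop cluster.
# Paths below are hypothetical examples, not defaults.

# Instance for cluster A: reads cluster A's YARN/HDFS configs at startup.
HADOOP_CONF_DIR=/etc/hadoop/conf-cluster-a \
LIVY_CONF_DIR=/opt/livy/conf-cluster-a \
/opt/livy/bin/livy-server start

# Instance for cluster B: separate conf dir, and its livy.conf must
# use a different livy.server.port to avoid a bind conflict.
HADOOP_CONF_DIR=/etc/hadoop/conf-cluster-b \
LIVY_CONF_DIR=/opt/livy/conf-cluster-b \
/opt/livy/bin/livy-server start
```

Clients then pick the cluster by targeting the corresponding Livy endpoint.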
Thanks,
Meisam