Posted to users@zeppelin.apache.org by Minquan Xu <mx...@smg.com> on 2017/06/05 17:33:32 UTC

how to configure livy.sparkr to run jobs across cluster nodes

Hi All,

My %livy.sparkr jobs appear to run in local mode, i.e., they do not show up in the YARN ResourceManager UI, while %spark.pyspark and Jupyter Toree SparkR jobs run across the cluster nodes.

Platform: Hortonworks HDP 2.6, Zeppelin 0.7.0, OS: Linux 2.6.

In the Zeppelin interpreter settings, livy.spark.master is set to yarn-client.
Any suggestions are appreciated.
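
For what it's worth, a paragraph like the one below should show which master the Livy-created session is actually using (a sketch, assuming Spark 2.x SparkR, where sparkR.conf is available):

%livy.sparkr
# Ask the Livy-created SparkSession which master it is actually using.
# A local-mode session reports something like "local[*]" instead of "yarn-client".
sparkR.conf("spark.master")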


Livy interpreter setting (%livy.sparkr):
livy.spark.master = yarn-client
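
One thing I still need to check is whether the Livy server's own configuration pins the master, since as far as I understand livy.conf takes precedence over what the interpreter requests, and its template default is local. A sketch of the entries I mean (file path is an assumption; on HDP this is normally managed through Ambari):

# /etc/livy/conf/livy.conf (path is an assumption for my install)
# Master and deploy mode the Livy server uses for the sessions it creates.
# If livy.spark.master is left at its template default (local), sessions
# would run locally regardless of the Zeppelin interpreter setting.
livy.spark.master = yarn
livy.spark.deploy-mode = client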


%spark.pyspark, as seen in the YARN ResourceManager UI:
[inline screenshot omitted]


Jupyter Toree SparkR:
[inline screenshot omitted]

Thanks,

Minquan
