Posted to user@spark.apache.org by Stuti Awasthi <st...@hcl.com> on 2014/06/18 07:59:48 UTC

SparkR Installation

Hi All,

I wanted to try SparkR. Do we need R preinstalled on all the nodes of the cluster before installing the SparkR package? Please guide me on how to proceed with this. As of now, I work with R only on a single node.
Please suggest

Thanks
Stuti Awasthi


::DISCLAIMER::
----------------------------------------------------------------------------------------------------------------------------------------------------

The contents of this e-mail and any attachment(s) are confidential and intended for the named recipient(s) only.
E-mail transmission is not guaranteed to be secure or error-free as information could be intercepted, corrupted,
lost, destroyed, arrive late or incomplete, or may contain viruses in transmission. The e mail and its contents
(with or without referred errors) shall therefore not attach any liability on the originator or HCL or its affiliates.
Views or opinions, if any, presented in this email are solely those of the author and may not necessarily reflect the
views or opinions of HCL or its affiliates. Any form of reproduction, dissemination, copying, disclosure, modification,
distribution and / or publication of this message without the prior written consent of authorized representative of
HCL is strictly prohibited. If you have received this email in error please delete it and notify the sender immediately.
Before opening any email and/or attachments, please check them for viruses and other defects.

----------------------------------------------------------------------------------------------------------------------------------------------------

Re: SparkR Installation

Posted by Zongheng Yang <zo...@gmail.com>.
Hi Stuti,

Yes, you do need to install R on all nodes. The rJava library is also
required; it can be installed from within the R shell simply by running
'install.packages("rJava")'. Further installation instructions after
that step can be found in the README here:
https://github.com/amplab-extras/SparkR-pkg
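
The steps above can be sketched as a per-node setup script. This is only
an illustration, not from the README: the apt-get package names, the use
of sudo, and the install-dev.sh build step are assumptions — verify them
against the SparkR-pkg README and your distribution's package manager.
It prints the commands by default (dry run) rather than executing them.

```shell
#!/bin/sh
# Sketch of the per-node SparkR prerequisites (dry run by default;
# set DRY_RUN=0 to actually execute the commands).
run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "+ $*"            # dry run: just print the command
  else
    "$@"                   # real run: execute it
  fi
}

# Repeat on every node of the cluster (e.g. via ssh or pdsh):
run sudo apt-get install -y r-base          # R itself, on each node
run Rscript -e 'install.packages("rJava")'  # rJava, required by SparkR
run git clone https://github.com/amplab-extras/SparkR-pkg
run sh SparkR-pkg/install-dev.sh            # build step per the README
```

Run the script on each worker (or push it out with a cluster shell tool)
before installing the SparkR package itself.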

Zongheng

On Tue, Jun 17, 2014 at 10:59 PM, Stuti Awasthi <st...@hcl.com> wrote:
> Hi All,
>
>
>
> I wanted to try SparkR. Do we need preinstalled R on all the nodes of the
> cluster before installing SparkR package ? Please guide me how to proceed
> with this. As of now, I work with R only on single node.
>
> Please suggest
>
>
>
> Thanks
>
> Stuti Awasthi