Posted to user@spark.apache.org by Arunkumar Pillai <ar...@gmail.com> on 2016/05/31 12:46:00 UTC

Running R code in SparkR

Hi

I have a basic question about SparkR.

1. Can we run R code in Spark using SparkR, or does SparkR only expose certain
Spark functionality that is executed in Spark through R?



-- 
Thanks and Regards
        Arun

RE: Running R code in SparkR

Posted by "Kumar, Saurabh 5. (Nokia - IN/Bangalore)" <sa...@nokia.com>.
Hi Arunkumar,

SparkR has much more limited functionality than R, and some R types, such as data.table, are not available in SparkR. So you need to check the compatibility of your R code with SparkR carefully.
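
For example, something like this (only a rough sketch against the Spark 1.6-era SparkR API, so check your version; the data.table here is purely illustrative): a data.table has to be converted to a plain data.frame before SparkR can distribute it.

library(data.table)  # local-only package; SparkR has no data.table support
library(SparkR)

sc <- sparkR.init(master = "local")   # Spark 1.6-style entry points
sqlContext <- sparkRSQL.init(sc)

dt <- data.table(x = 1:3, y = c("a", "b", "c"))  # illustrative local table

# A data.table cannot be handed to SparkR directly; convert it first
df <- createDataFrame(sqlContext, as.data.frame(dt))
head(df)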

Regards,
Saurabh



Re: Running R code in SparkR

Posted by my...@gmail.com.
Hi Arunkumar ,

Yes, R can be integrated with Spark through SparkR. There are a couple of blog posts about it on the net, and the Spark documentation covers it too.

https://spark.apache.org/docs/latest/sparkr.html
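
To give a quick feel for it, a minimal session looks roughly like this (just a sketch assuming the Spark 1.6-era API; the entry points changed in later releases):

library(SparkR)

sc <- sparkR.init(master = "local")   # Spark 1.6-style entry points
sqlContext <- sparkRSQL.init(sc)

# Distribute a built-in R data.frame as a SparkR DataFrame
df <- createDataFrame(sqlContext, faithful)

# DataFrame operations run on Spark, not in the local R process
head(filter(df, df$waiting < 50))

sparkR.stop()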



Just remember that not all of the R packages you may have worked with are supported in SparkR, though SparkR does ship with a good set of functionality of its own.

As I understand it, you cannot run base-R functions like sapply on distributed data, for example; such operations would have to be ported/re-coded to work on RDDs. The R community, as far as I can tell from YouTube videos, is not very deeply involved with the Spark community.
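
To make the sapply point concrete (again only a sketch against the Spark 1.6-era API, reusing the sqlContext from the snippet above): the same aggregation is written with a base-R apply function locally, but through the DataFrame API in SparkR.

# Local R: base-R apply functions work on an in-memory data.frame
sapply(faithful, mean)

# SparkR: sapply is not available on distributed data; the computation
# has to be expressed through the DataFrame API instead
df <- createDataFrame(sqlContext, faithful)
collect(agg(df, mean(df$eruptions), mean(df$waiting)))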





On May 31, 2016, at 18:16, Arunkumar Pillai <ar...@gmail.com> wrote:

> Hi
> 
> I have a basic question about SparkR.
> 
> 1. Can we run R code in Spark using SparkR, or does SparkR only expose certain Spark functionality that is executed in Spark through R?
> 
> 
> 
> -- 
> Thanks and Regards
>        Arun


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org