Posted to user@spark.apache.org by guxiaobo1982 <gu...@qq.com> on 2014/01/02 05:55:16 UTC

Where can I find more information about the R interface for Spark?

I read the good news from here:
  
 http://blog.revolutionanalytics.com/2013/12/apache-spark.html
  
  
  
 >> Currently, Spark supports programming interfaces for Scala, Java and Python.
 >> For R users, there is good news: an R interface is in the works and under
 >> development by the team at AMPLab; our sources tell us this is expected to be
 >> released in the first half of 2014.
  
  
 Regards,
  
 Xiaobo Gu

Re: Where can I find more information about the R interface for Spark?

Posted by Ted Yu <yu...@gmail.com>.
Please follow SPARK-5654

On Wed, Mar 4, 2015 at 7:22 PM, Haopu Wang <HW...@qilinsoft.com> wrote:

>  Thanks, it's an active project.
>
>
>
> Will it be released with Spark 1.3.0?
>
>
>  ------------------------------
>
> *From:* 鹰 [mailto:980548079@qq.com]
> *Sent:* Thursday, March 05, 2015 11:19 AM
> *To:* Haopu Wang; user
> *Subject:* Re: Where can I find more information about the R interface
> for Spark?
>
>
>
> You can search for SparkR on Google, or find it on GitHub.
>

RE: Where can I find more information about the R interface for Spark?

Posted by Haopu Wang <HW...@qilinsoft.com>.
Thanks, it's an active project.

 

Will it be released with Spark 1.3.0?

 

________________________________

From: 鹰 [mailto:980548079@qq.com] 
Sent: Thursday, March 05, 2015 11:19 AM
To: Haopu Wang; user
Subject: Re: Where can I find more information about the R interface for Spark?

 

You can search for SparkR on Google, or find it on GitHub.


Re: Where can I find more information about the R interface for Spark?

Posted by 鹰 <98...@qq.com>.
You can search for SparkR on Google, or find it on GitHub.

Re: Where can I find more information about the R interface for Spark?

Posted by haopu <hw...@qilinsoft.com>.
Do you have any update on SparkR?






Re: Where can I find more information about the R interface for Spark?

Posted by Shivaram Venkataraman <sh...@gmail.com>.
Hi

As Zongheng mentioned, we have been working on an R frontend that is
similar in spirit to PySpark, i.e. it allows you to create and
manipulate RDDs from R.

Our plan is to have an alpha version that the community can try out in
the next few weeks -- we will ping the user list once it's ready.

Thanks
Shivaram

On Thu, Jan 2, 2014 at 6:14 PM, Zongheng Yang <zo...@gmail.com> wrote:
> Hi Shay,
>
> Good to know there is interest in the R interface! I am not sure
> about the specific release timeframe. Here's what should probably be
> available in the initial release:
>
> - common RDD transformations (map() / lapply(), flatMap(),
> lapplyPartition(), common shuffle functions, ...) and actions (count()
> / length(), collect(), ...)
> - support for shipping R closures; support for pairwise RDDs
>
> The features are by no means complete yet, but the hope is that the
> community can also contribute to them. So far we are able to
> port some Spark examples and have them running, such as pi estimation,
> logistic regression, and wordcount.
>
> Zongheng
>
> On Fri, Jan 3, 2014 at 1:36 AM, Shay Seng <sh...@1618labs.com> wrote:
>> I've been using JRI to communicate with R from Spark, with some utils to
>> convert from Scala data types into R datatypes/dataframes etc.
>> http://www.rforge.net/JRI/
>> I've been using mapPartitions to push R closures through JRI and collecting
>> back the results in Spark. This works reasonably well, though nowhere near as
>> nicely as straight Spark -- as expected.
>>
>> I've also been using JavaGD to allow me to use ggplot to visualize data from
>> Spark -> R, which, IMO, is much nicer than anything Java/Scala can provide.
>>
>>
>> It's interesting to hear of the R interface work at AMPLab; would anyone
>> there care to elaborate on what will be available, the limitations, and
>> possibly the timeframe?
>>
>>
>> tks
>> shay
>>
>>
>>
>> On Wed, Jan 1, 2014 at 8:55 PM, guxiaobo1982 <gu...@qq.com> wrote:
>>>
>>> I read the good news from here:
>>>
>>> http://blog.revolutionanalytics.com/2013/12/apache-spark.html
>>>
>>>
>>>
>>> >> Currently, Spark supports programming interfaces for Scala, Java and
>>> >> Python. For R users, there is good news: an R interface is in the works
>>> >> and under development by the team at AMPLab; our sources tell us this is
>>> >> expected to be released in the first half of 2014.
>>>
>>>
>>> Regards,
>>>
>>> Xiaobo Gu
>>
>>

Re: Where can I find more information about the R interface for Spark?

Posted by Zongheng Yang <zo...@gmail.com>.
Hi Shay,

Good to know there is interest in the R interface! I am not sure
about the specific release timeframe. Here's what should probably be
available in the initial release:

- common RDD transformations (map() / lapply(), flatMap(),
lapplyPartition(), common shuffle functions, ...) and actions (count()
/ length(), collect(), ...)
- support for shipping R closures; support for pairwise RDDs

The features are by no means complete yet, but the hope is that the
community can also contribute to them. So far we are able to
port some Spark examples and have them running, such as pi estimation,
logistic regression, and wordcount.
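
As a rough illustration of what a word count might look like against that
API -- a sketch only, since none of this is released yet: the sparkR.init(),
textFile(), and reduceByKey() names below are assumptions, while lapply(),
flatMap(), collect(), and the pairwise-RDD idea come from the list above.

  library(SparkR)                         # assumed package name
  sc    <- sparkR.init(master = "local")  # hypothetical context constructor
  lines <- textFile(sc, "input.txt")      # hypothetical text-file loader

  words  <- flatMap(lines, function(line) strsplit(line, " ")[[1]])
  pairs  <- lapply(words, function(word) list(word, 1L))    # pairwise RDD
  counts <- reduceByKey(pairs, function(a, b) a + b, 2L)    # hypothetical shuffle op
  collect(counts)                         # list of (word, count) pairs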

Zongheng

On Fri, Jan 3, 2014 at 1:36 AM, Shay Seng <sh...@1618labs.com> wrote:
> I've been using JRI to communicate with R from Spark, with some utils to
> convert from Scala data types into R datatypes/dataframes etc.
> http://www.rforge.net/JRI/
> I've been using mapPartitions to push R closures through JRI and collecting
> back the results in Spark. This works reasonably well, though nowhere near as
> nicely as straight Spark -- as expected.
>
> I've also been using JavaGD to allow me to use ggplot to visualize data from
> Spark -> R, which, IMO, is much nicer than anything Java/Scala can provide.
>
>
> It's interesting to hear of the R interface work at AMPLab; would anyone
> there care to elaborate on what will be available, the limitations, and
> possibly the timeframe?
>
>
> tks
> shay
>
>
>
> On Wed, Jan 1, 2014 at 8:55 PM, guxiaobo1982 <gu...@qq.com> wrote:
>>
>> I read the good news from here:
>>
>> http://blog.revolutionanalytics.com/2013/12/apache-spark.html
>>
>>
>>
>> >> Currently, Spark supports programming interfaces for Scala, Java and
>> >> Python. For R users, there is good news: an R interface is in the works
>> >> and under development by the team at AMPLab; our sources tell us this is
>> >> expected to be released in the first half of 2014.
>>
>>
>> Regards,
>>
>> Xiaobo Gu
>
>

Re: Where can I find more information about the R interface for Spark?

Posted by Shay Seng <sh...@1618labs.com>.
I've been using JRI to communicate with R from Spark, with some utils to
convert from Scala data types into R datatypes/dataframes etc.
http://www.rforge.net/JRI/
I've been using mapPartitions to push R closures through JRI and collecting
back the results in Spark. This works reasonably well, though nowhere near as
nicely as straight Spark -- as expected.
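
To make the "pushing R closures" part concrete, here is a purely illustrative
sketch of the kind of R function one might have JRI evaluate once per
partition; the Scala/JRI plumbing is not shown, and partition_df is a
hypothetical stand-in for rows marshalled over from a Spark partition.

  # Hypothetical per-partition closure: takes one partition's rows as a data
  # frame and returns a small result (here, coefficients of a local fit).
  process_partition <- function(partition_df) {
    fit <- lm(y ~ x, data = partition_df)   # arbitrary R code can run here
    coef(fit)                                # compact result shipped back to Spark
  }

  # Standalone test with fake data, standing in for one partition:
  partition_df <- data.frame(x = runif(100))
  partition_df$y <- 2 * partition_df$x + rnorm(100, sd = 0.1)
  process_partition(partition_df)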

I've also been using JavaGD to allow me to use ggplot to visualize data
from Spark -> R, which, IMO, is much nicer than anything Java/Scala can
provide.
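
As a rough sketch of that visualization path, assuming the data of interest
has already been collected from Spark into an ordinary data frame (the
results frame and its columns below are made up for illustration):

  library(JavaGD)    # Java graphics device, so plots can render inside a JVM app
  library(ggplot2)

  JavaGD()           # open a JavaGD device instead of the default R device

  # Stand-in for a data frame collected back from Spark:
  results <- data.frame(time = 1:100, value = cumsum(rnorm(100)))
  print(ggplot(results, aes(x = time, y = value)) + geom_line())

  dev.off()          # close the device when done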


It's interesting to hear of the R interface work at AMPLab; would anyone
there care to elaborate on what will be available, the limitations, and
possibly the timeframe?


tks
shay



On Wed, Jan 1, 2014 at 8:55 PM, guxiaobo1982 <gu...@qq.com> wrote:

> I read the good news from here:
>
> http://blog.revolutionanalytics.com/2013/12/apache-spark.html
>
>
>
> >> Currently, Spark supports programming interfaces for Scala, Java
> >> <http://spark.incubator.apache.org/docs/latest/java-programming-guide.html>
> >> and Python
> >> <http://spark.incubator.apache.org/docs/latest/python-programming-guide.html>.
> >> For R users, there is good news: an R interface is in the works and
> >> under development by the team at AMPLab; our sources tell us this is
> >> expected to be released in the first half of 2014.
>
>
> Regards,
>
> Xiaobo Gu
>