Posted to user@spark.apache.org by Shivaram Venkataraman <sh...@eecs.berkeley.edu> on 2014/01/16 23:14:40 UTC

SparkR developer release

I'm happy to announce the developer preview of SparkR, an R frontend
for Spark. SparkR presents Spark's API in R and allows you to write
code in R and run the computation on a Spark cluster. You can try out
SparkR today by installing it from our github repo at
https://github.com/amplab-extras/SparkR-pkg .
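
For a quick taste, here is a minimal sketch of what SparkR code looks
like (loosely adapted from the examples on the project page; exact
function names and arguments may have changed, so treat this as
illustrative rather than definitive):

  library(SparkR)

  # Connect to a local Spark instance; point this at a cluster URL
  # (e.g. spark://host:7077) for a real deployment.
  sc <- sparkR.init(master = "local")

  # Distribute an R vector as an RDD and apply an ordinary R function
  # to each element on the cluster.
  rdd <- parallelize(sc, 1:100)
  squares <- lapply(rdd, function(x) x * x)

  # Bring the results back into the local R session as a list.
  head(collect(squares))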

Right now SparkR is available as a standalone package that can be
installed to run on an existing Spark installation. Note that SparkR
requires Spark >= 0.9 and the default build uses the recent 0.9
release candidate. In the future we will consider merging this with
Apache Spark.
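
As a rough illustration, installing straight from GitHub into an R
session might look like the sketch below. This is a hypothetical
example using devtools; the repository name is real, but the subdir
argument and build steps are assumptions, so follow the README in the
repo for the supported procedure.

  # Hypothetical install sketch; assumes an existing Spark >= 0.9
  # installation and a working R build toolchain. See the SparkR-pkg
  # README for the supported steps.
  install.packages("devtools")
  devtools::install_github("amplab-extras/SparkR-pkg", subdir = "pkg")

  library(SparkR)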

More details about SparkR and examples of SparkR code can be found at
http://amplab-extras.github.io/SparkR-pkg. I would like to thank
Zongheng Yang, Matei Zaharia and Matt Massie for their contributions
and help in developing SparkR.

Comments and pull requests are welcome on github.

Thanks
Shivaram

Re: SparkR developer release

Posted by "Kk.gmail" <kh...@gmail.com>.
Great news. A lot of data scientists are looking for this.



> On Jan 16, 2014, at 2:14 PM, Shivaram Venkataraman <sh...@eecs.berkeley.edu> wrote:
> 
> I'm happy to announce the developer preview of SparkR, an R frontend
> for Spark. SparkR presents Spark's API in R and allows you to write
> code in R and run the computation on a Spark cluster. You can try out
> SparkR today by installing it from our github repo at
> https://github.com/amplab-extras/SparkR-pkg .
> 
> Right now SparkR is available as a standalone package that can be
> installed to run on an existing Spark installation. Note that SparkR
> requires Spark >= 0.9 and the default build uses the recent 0.9
> release candidate. In the future we will consider merging this with
> Apache Spark.
> 
> More details about SparkR and examples of SparkR code can be found at
> http://amplab-extras.github.io/SparkR-pkg. I would like to thank
> Zongheng Yang, Matei Zaharia and Matt Massie for their contributions
> and help in developing SparkR.
> 
> Comments and pull requests are welcome on github.
> 
> Thanks
> Shivaram

Re: SparkR developer release

Posted by Chester Chen <ch...@yahoo.com>.
This is something I am looking for; I will definitely take a look.
Chester

Sent from my iPhone

On Jan 16, 2014, at 2:14 PM, Shivaram Venkataraman <sh...@eecs.berkeley.edu> wrote:

> I'm happy to announce the developer preview of SparkR, an R frontend
> for Spark. SparkR presents Spark's API in R and allows you to write
> code in R and run the computation on a Spark cluster. You can try out
> SparkR today by installing it from our github repo at
> https://github.com/amplab-extras/SparkR-pkg .
> 
> Right now SparkR is available as a standalone package that can be
> installed to run on an existing Spark installation. Note that SparkR
> requires Spark >= 0.9 and the default build uses the recent 0.9
> release candidate. In the future we will consider merging this with
> Apache Spark.
> 
> More details about SparkR and examples of SparkR code can be found at
> http://amplab-extras.github.io/SparkR-pkg. I would like to thank
> Zongheng Yang, Matei Zaharia and Matt Massie for their contributions
> and help in developing SparkR.
> 
> Comments and pull requests are welcome on github.
> 
> Thanks
> Shivaram

Re: SparkR developer release

Posted by Raja Pasupuleti <ra...@gmail.com>.
Nice!


On Thu, Jan 16, 2014 at 5:14 PM, Shivaram Venkataraman <
shivaram@eecs.berkeley.edu> wrote:

> I'm happy to announce the developer preview of SparkR, an R frontend
> for Spark. SparkR presents Spark's API in R and allows you to write
> code in R and run the computation on a Spark cluster. You can try out
> SparkR today by installing it from our github repo at
> https://github.com/amplab-extras/SparkR-pkg .
>
> Right now SparkR is available as a standalone package that can be
> installed to run on an existing Spark installation. Note that SparkR
> requires Spark >= 0.9 and the default build uses the recent 0.9
> release candidate. In the future we will consider merging this with
> Apache Spark.
>
> More details about SparkR and examples of SparkR code can be found at
> http://amplab-extras.github.io/SparkR-pkg. I would like to thank
> Zongheng Yang, Matei Zaharia and Matt Massie for their contributions
> and help in developing SparkR.
>
> Comments and pull requests are welcome on github.
>
> Thanks
> Shivaram
>