Posted to user@spark.apache.org by "andy.petrella@gmail.com" <an...@gmail.com> on 2014/01/16 23:47:02 UTC

Re: SparkR developer release

Cool, that's awesome and something I'll surely investigate in the coming weeks.
Great job!

Sent from my HTC

----- Reply message -----
De : "Shivaram Venkataraman" <sh...@eecs.berkeley.edu>
Pour : <de...@spark.incubator.apache.org>, <us...@spark.incubator.apache.org>
Cc : "Zongheng Yang" <zo...@gmail.com>, "Matthew L Massie" <ma...@berkeley.edu>
Objet : SparkR developer release
Date : jeu., janv. 16, 2014 23:14


I'm happy to announce the developer preview of SparkR, an R frontend
for Spark. SparkR presents Spark's API in R and allows you to write
code in R and run the computation on a Spark cluster. You can try out
SparkR today by installing it from our GitHub repo at
https://github.com/amplab-extras/SparkR-pkg .
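
To give a quick feel for the API, here is a minimal sketch of a SparkR
session. It is purely illustrative and assumes functions named
sparkR.init, parallelize, lapply and collect as in the examples on the
project page, so please refer to those examples for the exact API:

  library(SparkR)

  # Connect to a local Spark instance
  sc <- sparkR.init(master = "local")

  # Distribute a local R vector as an RDD with 2 partitions
  rdd <- parallelize(sc, 1:100, 2)

  # Apply an R function to every element on the cluster
  squares <- lapply(rdd, function(x) x * x)

  # Bring the results back into the local R session as a list
  head(collect(squares))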

Right now SparkR is available as a standalone package that can be
installed to run on an existing Spark installation. Note that SparkR
requires Spark >= 0.9 and the default build uses the recent 0.9
release candidate. In the future we will consider merging this with
Apache Spark.
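
As a rough sketch, installing against an existing Spark setup could
look something like the following (this assumes you have the devtools
package and that the R sources sit under pkg/ in the repo; the README
in the repo has the authoritative steps):

  # Install the development version straight from GitHub
  # (assumes devtools; the SparkR sources are assumed to live under
  # pkg/ in the repository)
  install.packages("devtools")
  devtools::install_github("amplab-extras/SparkR-pkg", subdir = "pkg")

  # Then point SparkR at an existing Spark >= 0.9 installation
  library(SparkR)
  sc <- sparkR.init(master = "local")  # or a spark:// cluster URL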

More details about SparkR and examples of SparkR code can be found at
http://amplab-extras.github.io/SparkR-pkg. I would like to thank
Zongheng Yang, Matei Zaharia and Matt Massie for their contributions
and help in developing SparkR.

Comments and pull requests are welcome on GitHub.

Thanks
Shivaram