Posted to user@spark.apache.org by yh18190 <yh...@gmail.com> on 2014/03/30 12:22:54 UTC

Can we convert scala.collection.ArrayBuffer[(Int,Double)] to org.spark.RDD[(Int,Double)]

Hi,

Can we convert a Scala collection directly to a Spark RDD without
using the parallelize method?
Is there any way to create an RDD from a Scala type using some kind
of typecast?

Please suggest.



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Can-we-convert-scala-collection-ArrayBuffer-Int-Double-to-org-spark-RDD-Int-Double-tp3486.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Can we convert scala.collection.ArrayBuffer[(Int,Double)] to org.spark.RDD[(Int,Double)]

Posted by Mayur Rustagi <ma...@gmail.com>.
The Scala object needs to be sent to the workers to be used as an RDD;
parallelize is the way to do that. What are you looking to do?
Alternatively, you can serialize the Scala object to HDFS/disk and load
it from there.
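
For example, a minimal sketch of both approaches (this assumes a
SparkContext named sc is already in scope, as in spark-shell, and the
HDFS path is only illustrative):

  import scala.collection.mutable.ArrayBuffer

  val buf = ArrayBuffer((1, 2.0), (2, 3.5), (3, 4.1))

  // parallelize ships the local collection to the cluster
  // as an RDD[(Int, Double)].
  val rdd = sc.parallelize(buf)

  // Alternatively, persist the data and load it back as an RDD later.
  rdd.saveAsObjectFile("hdfs:///tmp/pairs")   // illustrative path
  val reloaded = sc.objectFile[(Int, Double)]("hdfs:///tmp/pairs")

There is no typecast from a local collection to an RDD; an RDD is a
distributed handle to data on the cluster, so the data has to get there
somehow, either via parallelize or by loading it from storage.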
Regards
Mayur

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



On Sun, Mar 30, 2014 at 6:22 AM, yh18190 <yh...@gmail.com> wrote:

> Hi,
>
> Can we convert a Scala collection directly to a Spark RDD without
> using the parallelize method?
> Is there any way to create an RDD from a Scala type using some kind
> of typecast?
>
> Please suggest.
>