Posted to dev@spark.apache.org by Dirceu Semighini Filho <di...@gmail.com> on 2015/03/02 14:16:50 UTC

Re: How to create a Row from a List or Array in Spark using Scala

You can use the parallelize method:

import org.apache.spark.sql.Row

val data = List(
  Row(1, 5, "vlr1", 10.5),
  Row(2, 1, "vl3", 0.1),
  Row(3, 8, "vl3", 10.0),
  Row(4, 1, "vl4", 1.0))
val rdd = sc.parallelize(data)

Here I'm using a list of Rows, but you could use it with a list of any
other kind of object, like this:


val x = sc.parallelize(List("a","b","c"))

Here x is an RDD[String] and sc is the SparkContext.
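
To answer the subject line directly, Row.fromSeq (which Devan mentions
below) builds a Row from any Scala Seq, so a List or an Array works
without pulling out each element by hand. A minimal sketch, assuming
Spark 1.3's org.apache.spark.sql.Row (on earlier releases the Row type
lives in org.apache.spark.sql.catalyst.expressions, as in the original
import):

import org.apache.spark.sql.Row

// Row.fromSeq accepts any Seq[Any], so Lists and Arrays both work
val fromList: Row  = Row.fromSeq(List(1, 5, "vlr1", 10.5))
val fromArray: Row = Row.fromSeq(Array[Any](2, 1, "vl3", 0.1))

// Row(...) itself is a varargs factory, so an existing collection
// can also be passed with the splat syntax
val splatted: Row = Row(List(3, 8, "vl3", 10.0): _*)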


Regards,

Dirceu


2015-02-28 5:37 GMT-03:00 DEVAN M.S. <ms...@gmail.com>:

>   In the Scala API it's there, Row.fromSeq(ARRAY). I don't know much
> more about the Java API
>
>
>
> Devan M.S. | Research Associate | Cyber Security | AMRITA VISHWA
> VIDYAPEETHAM | Amritapuri | Cell +919946535290 |
>
>
> On Sat, Feb 28, 2015 at 1:28 PM, r7raul1984@163.com <r7...@163.com>
> wrote:
>
> > import org.apache.spark.sql.catalyst.expressions._
> >
> > val values: JavaArrayList[Any] = new JavaArrayList()
> > computedValues = Row(values.get(0), values.get(1)) // It is not good
> > to use get(index) like this.  How do I create a Row from a List or
> > Array in Spark using Scala?
> >
> >
> >
> > r7raul1984@163.com
> >
>
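
For the JavaArrayList in the quoted code, one way to avoid the
per-index get() calls is to wrap the Java list as a Scala Seq and hand
it to Row.fromSeq. A sketch under the assumption that JavaArrayList is
an alias for java.util.ArrayList (the JavaConverters bridge below is my
illustration, not something from the thread):

import java.util.{ArrayList => JavaArrayList}
import scala.collection.JavaConverters._
import org.apache.spark.sql.Row

val values: JavaArrayList[Any] = new JavaArrayList()
values.add(1)
values.add("vlr1")

// asScala exposes the java.util.List as a Scala Buffer (a Seq),
// so the whole list can be passed to Row.fromSeq in one call
val computedValues: Row = Row.fromSeq(values.asScala)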