Posted to users@zeppelin.apache.org by Jose Rivera-Rubio <jo...@internavenue.com> on 2015/10/08 16:40:15 UTC

Cannot convert RDD of BigDecimal into dataframe

Hi guys,

I have a list of BigDecimal obtained through SparkSQL from some Parquet
files.

list: List[BigDecimal] = List(1015.00, 580.00, 290.00, 1160.00)

When I try to convert them to a DataFrame to visualize them using the
Zeppelin context,


val df = sc.parallelize(list).toDF("list_of_numbers")


I get the following error:

error: value toDF is not a member of org.apache.spark.rdd.RDD[BigDecimal]

If I map the values to Double, I get the same error.

Any ideas?

Thanks!

Re: Cannot convert RDD of BigDecimal into dataframe

Posted by moon soo Lee <mo...@apache.org>.
I think you can define a case class and map the list of BigDecimal to a list of
your case class.

If you parallelize this list, then toDF will work.
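
Something along these lines, for example (just a sketch; I'm assuming a Spark
1.x paragraph where sc and sqlContext are the ones Zeppelin provides, and the
case class name Amount is only for illustration):

import sqlContext.implicits._

// Wrap each value in a case class so Spark can infer the schema
// (scala.math.BigDecimal becomes a decimal column).
case class Amount(list_of_numbers: BigDecimal)

val list: List[BigDecimal] = List(1015.00, 580.00, 290.00, 1160.00).map(BigDecimal(_))

val df = sc.parallelize(list.map(Amount(_))).toDF()

df.show()   // or z.show(df) if you want Zeppelin's built-in table view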

Thanks,
moon
On Thu, Oct 8, 2015 at 4:40 PM Jose Rivera-Rubio <
jose.rivera@internavenue.com> wrote:

> Hi guys,
>
> I have a list of BigDecimal obtained through SparkSQL from some Parquet
> files.
>
> list: List[BigDecimal] = List(1015.00, 580.00, 290.00, 1160.00)
>
> When I try to convert them to a DataFrame to visualize them using the
> Zeppelin context,
>
>
> val df = sc.parallelize(list).toDF("list_of_numbers")
>
>
> I get the following error:
>
> error: value toDF is not a member of org.apache.spark.rdd.RDD[BigDecimal]
>
> If I map the values to Double, I get the same error.
>
> Any ideas?
>
> Thanks!
>