Posted to dev@spark.apache.org by Paul R <pj...@gmail.com> on 2016/09/02 15:26:26 UTC

sparkR array type not supported

Hi there,

I’ve noticed the following command in sparkR 

>>> field = structField("x", "array")

Throws this error

>>> Error in checkType(type) : Unsupported type for SparkDataframe: array

I was wondering whether this is a bug, as the documentation says "array" should be a supported type.

Thanks 
---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: sparkR array type not supported

Posted by Shivaram Venkataraman <sh...@eecs.berkeley.edu>.
I think it needs a type for the elements in the array. For example

f <- structField("x", "array<integer>")

Thanks
Shivaram
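To expand on this, a minimal sketch of building a schema with an array column (the createDataFrame call and sample data are illustrative, not from the thread, and assume a running SparkR session):

```r
library(SparkR)

# An array column must declare its element type, e.g. "array<integer>";
# a bare "array" is rejected by checkType, which produces the error above.
f <- structField("x", "array<integer>")
schema <- structType(f)

# Hypothetical usage: a one-column DataFrame where each row holds a list of integers.
df <- createDataFrame(list(list(list(1L, 2L, 3L))), schema)
printSchema(df)
```

The same element-type requirement applies to the other parameterized types, such as "map<string,integer>" and nested "struct<...>" definitions.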

On Fri, Sep 2, 2016 at 8:26 AM, Paul R <pj...@gmail.com> wrote:
> Hi there,
>
> I’ve noticed the following command in sparkR
>
>>>> field = structField("x", "array")
>
> Throws this error
>
>>>> Error in checkType(type) : Unsupported type for SparkDataframe: array
>
> I was wondering whether this is a bug, as the documentation says "array" should be a supported type.
>
> Thanks
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>