Posted to user@spark.apache.org by Koert Kuipers <ko...@tresata.com> on 2016/09/25 22:27:51 UTC

Re: ArrayType support in Spark SQL

not pretty but this works:

import org.apache.spark.sql.functions.udf
df.withColumn("array", udf(() => Seq(1, 2, 3)).apply())
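
a UDF-free alternative (sketch, untested here) is to build the array out of
per-element literals with functions.array, since lit itself rejects the Seq:

```scala
import org.apache.spark.sql.functions.{array, lit}

// lit(Seq(1, 2, 3)) throws "Unsupported literal type", but each element
// is a supported literal, so wrap them individually and combine with array().
df.withColumn("array", array(lit(1), lit(2), lit(3)))
```
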


On Sun, Sep 25, 2016 at 6:13 PM, Jason White <ja...@shopify.com>
wrote:

> It seems that `functions.lit` doesn't support ArrayTypes. To reproduce:
>
> org.apache.spark.sql.functions.lit(2 :: 1 :: Nil)
>
> java.lang.RuntimeException: Unsupported literal type class
> scala.collection.immutable.$colon$colon List(2, 1)
>   at
> org.apache.spark.sql.catalyst.expressions.Literal$.apply(
> literals.scala:59)
>   at org.apache.spark.sql.functions$.lit(functions.scala:101)
>   ... 48 elided
>
> This is about the first thing I tried to do with ArrayTypes in Spark SQL.
> Is this usage supported, or on the roadmap?
>
>
>
> --
> View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/ArrayType-support-in-Spark-SQL-tp19063.html
> Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
>