Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2016/01/29 22:06:39 UTC

[jira] [Updated] (SPARK-13094) Dataset Aggregators do not work with complex types

     [ https://issues.apache.org/jira/browse/SPARK-13094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust updated SPARK-13094:
-------------------------------------
    Target Version/s: 1.6.1

> Dataset Aggregators do not work with complex types
> --------------------------------------------------
>
>                 Key: SPARK-13094
>                 URL: https://issues.apache.org/jira/browse/SPARK-13094
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Deenar Toraskar
>
> Dataset aggregators over complex types fail with "Unable to find encoder for type stored in a Dataset", even though Datasets of those same complex types are supported. For example:
> import org.apache.spark.sql.expressions.Aggregator
> val arraySum = new Aggregator[Seq[Float], Seq[Float],
>   Seq[Float]] with Serializable {
>   def zero: Seq[Float] = Nil // The initial value.
>   def reduce(currentSum: Seq[Float], currentRow: Seq[Float]) =
>     sumArray(currentSum, currentRow)
>   def merge(sum: Seq[Float], row: Seq[Float]) = sumArray(sum, row)
>   def finish(b: Seq[Float]) = b // Return the final result.
>   // Element-wise sum of two sequences; an empty side acts as zero.
>   def sumArray(a: Seq[Float], b: Seq[Float]): Seq[Float] = {
>     (a, b) match {
>       case (Nil, Nil) => Nil
>       case (Nil, row) => row
>       case (sum, Nil) => sum
>       case (sum, row) => (sum, row).zipped.map(_ + _)
>     }
>   }
> }.toColumn
> <console>:47: error: Unable to find encoder for type stored in a Dataset.  Primitive types (Int, String, etc) and Product types (case classes) are supported by importing sqlContext.implicits._  Support for serializing other types will be added in future releases.
>        }.toColumn
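>
> A possible workaround (an untested sketch; it assumes ExpressionEncoder in 1.6.0 can reflectively derive a top-level encoder for Seq[Float], which the existing Dataset support suggests but does not guarantee) is to put an explicit encoder in scope before the definition above, so that the implicit lookup done by toColumn no longer depends on sqlContext.implicits._:
>
> import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
> // Assumption: ExpressionEncoder() can build a Seq[Float] encoder via
> // reflection, even though sqlContext.implicits._ in 1.6.0 does not offer one.
> implicit val seqFloatEncoder: ExpressionEncoder[Seq[Float]] = ExpressionEncoder()
>
> With that encoder in scope, the same .toColumn call should compile, and the resulting TypedColumn can then be used as, e.g., ds.select(arraySum).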


