Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/05/17 01:24:04 UTC

[jira] [Assigned] (SPARK-19089) Support nested arrays/seqs in Datasets

     [ https://issues.apache.org/jira/browse/SPARK-19089?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-19089:
------------------------------------

    Assignee:     (was: Apache Spark)

> Support nested arrays/seqs in Datasets
> --------------------------------------
>
>                 Key: SPARK-19089
>                 URL: https://issues.apache.org/jira/browse/SPARK-19089
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Michal Šenkýř
>            Priority: Minor
>
> Nested arrays and seqs are not supported as Dataset element types: the implicit encoders cover only flat collections of primitives and Product types, so both createDataset and toDS fail to resolve an encoder:
> {code}
> scala> spark.createDataset(Seq(Array(Array(1))))
> <console>:24: error: Unable to find encoder for type stored in a Dataset.  Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._  Support for serializing other types will be added in future releases.
>        spark.createDataset(Seq(Array(Array(1))))
>                           ^
> scala> Seq(Array(Array(1))).toDS()
> <console>:24: error: value toDS is not a member of Seq[Array[Array[Int]]]
>        Seq(Array(Array(1))).toDS()
> scala> spark.createDataset(Seq(Seq(Seq(1))))
> <console>:24: error: Unable to find encoder for type stored in a Dataset.  Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._  Support for serializing other types will be added in future releases.
>        spark.createDataset(Seq(Seq(Seq(1))))
> scala> Seq(Seq(Seq(1))).toDS()
> <console>:24: error: value toDS is not a member of Seq[Seq[Seq[Int]]]
>        Seq(Seq(Seq(1))).toDS()
> {code}
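> A possible workaround until encoders for nested collections are added: wrap the nested seq in a case class. This is a minimal sketch (it assumes Spark 2.1 with spark.implicits._ in scope; the Wrap name is purely illustrative), relying on the fact that nested collections are already supported as fields of Product types:
> {code}
> // Wrap is a hypothetical helper, not part of any Spark API.
> case class Wrap(values: Seq[Seq[Int]])
>
> // An implicit encoder for Seq[Product] exists, and the Product encoder is
> // derived recursively over its fields, so the nested seq round-trips:
> val ds = Seq(Wrap(Seq(Seq(1)))).toDS()
> ds.show()
> {code}
> This only sidesteps the missing implicits in SQLImplicits; proper support would derive encoders for arbitrarily nested arrays/seqs directly.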