Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2017/05/22 08:52:04 UTC
[jira] [Resolved] (SPARK-19089) Support nested arrays/seqs in Datasets
[ https://issues.apache.org/jira/browse/SPARK-19089?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-19089.
---------------------------------
Resolution: Fixed
Fix Version/s: 2.2.0
Issue resolved by pull request 18011
[https://github.com/apache/spark/pull/18011]
> Support nested arrays/seqs in Datasets
> --------------------------------------
>
> Key: SPARK-19089
> URL: https://issues.apache.org/jira/browse/SPARK-19089
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Reporter: Michal Šenkýř
> Priority: Minor
> Fix For: 2.2.0
>
>
> Nested arrays and seqs are not supported in Datasets:
> {code}
> scala> spark.createDataset(Seq(Array(Array(1))))
> <console>:24: error: Unable to find encoder for type stored in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases.
> spark.createDataset(Seq(Array(Array(1))))
> ^
> scala> Seq(Array(Array(1))).toDS()
> <console>:24: error: value toDS is not a member of Seq[Array[Array[Int]]]
> Seq(Array(Array(1))).toDS()
> scala> spark.createDataset(Seq(Seq(Seq(1))))
> <console>:24: error: Unable to find encoder for type stored in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases.
> spark.createDataset(Seq(Seq(Seq(1))))
> scala> Seq(Seq(Seq(1))).toDS()
> <console>:24: error: value toDS is not a member of Seq[Seq[Seq[Int]]]
> Seq(Seq(Seq(1))).toDS()
> {code}
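For reference, a minimal sketch of what the fix enables, assuming Spark 2.2.0 or later (where this issue is marked fixed) with the default implicits imported. The session setup and variable names below are illustrative and not part of the original issue; only the createDataset/toDS calls mirror the examples above.

{code}
// Sketch only: mirrors the calls from the issue description and is expected to
// compile against Spark 2.2.0+; the session setup here is illustrative.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("nested-collection-encoders")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

// Nested arrays now resolve to an encoder instead of failing at compile time
val nestedArrays = spark.createDataset(Seq(Array(Array(1))))
nestedArrays.show()

// Nested seqs likewise work with both createDataset and toDS()
val nestedSeqs = Seq(Seq(Seq(1))).toDS()
nestedSeqs.show()

spark.stop()
{code}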
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)