Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:15:39 UTC
[jira] [Resolved] (SPARK-18139) Dataset mapGroups with return type Seq[Product] produces scala.ScalaReflectionException: object $line262.$read not found
[ https://issues.apache.org/jira/browse/SPARK-18139?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-18139.
----------------------------------
Resolution: Incomplete
> Dataset mapGroups with return type Seq[Product] produces scala.ScalaReflectionException: object $line262.$read not found
> ------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-18139
> URL: https://issues.apache.org/jira/browse/SPARK-18139
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.0.1
> Reporter: Zach Kull
> Priority: Major
> Labels: bulk-closed
>
> mapGroups fails on a Dataset if the return type is only a Seq[Product]. It succeeds if the return type is more complex, such as Seq[(Int,Product)]. See the following code sample:
> {code}
> case class A(b:Int, c:Int)
> // Sample Dataset[A]
> val ds = ss.createDataset(Seq(A(1,2),A(2,2)))
> // The following aggregation should produce a Dataset[Seq[A]], but FAILS with scala.ScalaReflectionException: object $line262.$read not found.
> val ds2 = ds.groupByKey(_.b).mapGroups{ case (g,i) => (i.toSeq) }
> // Produces Dataset[(Int, Seq[A])] -> OK
> val ds1 = ds.groupByKey(_.b).mapGroups{ case (g,i) => (g,i.toSeq) }
> // reproducible when trying to manually create the following Encoder
> val e = newProductSeqEncoder[A]
> {code}
> Full Exception:
> scala.ScalaReflectionException: object $line262.$read not found.
> at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:162)
> at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:22)
> at $typecreator4$1.apply(<console>:116)
> at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:232)
> at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:232)
> at org.apache.spark.sql.SQLImplicits$$typecreator9$1.apply(SQLImplicits.scala:125)
> at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:232)
> at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:232)
> at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:49)
> at org.apache.spark.sql.SQLImplicits.newProductSeqEncoder(SQLImplicits.scala:125)
> ... 75 elided
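
Editor's note: the report itself shows that the tuple-valued variant (`ds1`) resolves its encoder correctly. A possible workaround, sketched here and not part of the original report, is to keep the key in the result type and to compile the case class outside the REPL (REPL-defined classes are what produce the `$line...$read` lookup). The object and method names below are illustrative.

```scala
// Sketch of a possible workaround, assuming Spark 2.x on the classpath.
// Not from the original report; names here are hypothetical.
import org.apache.spark.sql.{Dataset, SparkSession}

// Defining the case class in compiled code (not in the spark-shell)
// avoids the $line...$read reflection lookup described above.
case class A(b: Int, c: Int)

object MapGroupsWorkaround {
  // Returning Dataset[(Int, Seq[A])] instead of Dataset[Seq[A]] sidesteps
  // the failing newProductSeqEncoder[A] derivation, as the report's ds1 shows.
  def seqsByKey(ss: SparkSession, ds: Dataset[A]): Dataset[(Int, Seq[A])] = {
    import ss.implicits._
    ds.groupByKey(_.b).mapGroups { case (g, i) => (g, i.toSeq) }
  }
}
```

Callers that only need the sequences can ignore the key when consuming the result; projecting it away on the cluster would require the same Seq[Product] encoder and reintroduce the failure.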
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org