Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/10/27 12:04:59 UTC

[jira] [Commented] (SPARK-18139) Dataset mapGroups with return type Seq[Product] produces scala.ScalaReflectionException: object $line262.$read not found

    [ https://issues.apache.org/jira/browse/SPARK-18139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15611674#comment-15611674 ] 

Sean Owen commented on SPARK-18139:
-----------------------------------

I'm pretty sure this is just another instance of "case classes don't quite work with the Spark shell"; see the related JIRAs.
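
For context, these failures generally go away once the case class lives in compiled code rather than in a spark-shell session, because reflection then resolves a normal top-level class instead of a REPL-generated $lineNN.$read wrapper. A minimal sketch of that setup (the object name and SparkSession wiring below are illustrative, not taken from the report):

{code}
// Compiled application instead of spark-shell.
import org.apache.spark.sql.SparkSession

// Top-level case class: reflection sees a plain class, not $lineNN.$read.A.
case class A(b: Int, c: Int)

object MapGroupsSeqExample {
  def main(args: Array[String]): Unit = {
    val ss = SparkSession.builder().appName("SPARK-18139-example").getOrCreate()
    import ss.implicits._

    val ds = ss.createDataset(Seq(A(1, 2), A(2, 2)))

    // Same aggregation as in the report; here the implicit
    // newProductSeqEncoder[A] can materialize the TypeTag for A
    // without the $line262.$read lookup reported below.
    val ds2 = ds.groupByKey(_.b).mapGroups { case (_, i) => i.toSeq }
    ds2.show()

    ss.stop()
  }
}
{code}

Run with spark-submit, the implicit Seq encoder resolves normally and ds2 is a Dataset[Seq[A]].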

> Dataset mapGroups with return type Seq[Product] produces scala.ScalaReflectionException: object $line262.$read not found
> -----------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-18139
>                 URL: https://issues.apache.org/jira/browse/SPARK-18139
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.0.1
>            Reporter: Zach Kull
>
> mapGroups on a Dataset fails if the return type is a plain Seq[Product]. It succeeds if the return type is more complex, e.g. Seq[(Int, Product)]. See the following code sample (a shell-side workaround sketch follows the full stack trace below):
> {code}
> case class A(b:Int, c:Int)
> // Sample Dataset[A]
> val ds = ss.createDataset(Seq(A(1,2),A(2,2)))
> // The following aggregation should produce a Dataset[Seq[A]], but FAILS with scala.ScalaReflectionException: object $line262.$read not found.
> val ds2 = ds.groupByKey(_.b).mapGroups{ case (g,i) => (i.toSeq) }
> // Produces Dataset[(Int, Seq[A])] -> OK
> val ds1 = ds.groupByKey(_.b).mapGroups{ case (g,i) => (g,i.toSeq) }
> // Reproducible when trying to manually create the following Encoder
> val e = newProductSeqEncoder[A]
> {code}
> Full Exception:
> scala.ScalaReflectionException: object $line262.$read not found.
>   at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:162)
>   at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:22)
>   at $typecreator4$1.apply(<console>:116)
>   at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:232)
>   at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:232)
>   at org.apache.spark.sql.SQLImplicits$$typecreator9$1.apply(SQLImplicits.scala:125)
>   at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:232)
>   at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:232)
>   at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:49)
>   at org.apache.spark.sql.SQLImplicits.newProductSeqEncoder(SQLImplicits.scala:125)
>   ... 75 elided
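
When staying in the shell, one commonly used way to sidestep this class of encoder failure is to pass an explicit binary (Kryo) encoder for the problematic result type instead of relying on the implicit newProductSeqEncoder. A minimal sketch, assuming the same A and ds as in the report (the val name is illustrative):

{code}
import org.apache.spark.sql.Encoders

// Kryo-based encoder for Seq[A]; it is built from a ClassTag rather than
// the TypeTag machinery that throws scala.ScalaReflectionException here.
val seqAEncoder = Encoders.kryo[Seq[A]]

// Pass the encoder explicitly so the failing implicit is never consulted.
val ds2 = ds.groupByKey(_.b).mapGroups { case (_, i) => i.toSeq }(seqAEncoder)
{code}

The trade-off is that the resulting column is stored as an opaque serialized blob rather than a structured Seq of structs.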


