Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2017/08/08 08:34:01 UTC
[jira] [Resolved] (SPARK-21567) Dataset with Tuple of type alias throws error
[ https://issues.apache.org/jira/browse/SPARK-21567?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-21567.
---------------------------------
Resolution: Fixed
Assignee: Liang-Chi Hsieh
Fix Version/s: 2.3.0
> Dataset with Tuple of type alias throws error
> ---------------------------------------------
>
> Key: SPARK-21567
> URL: https://issues.apache.org/jira/browse/SPARK-21567
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.1.1, 2.2.0
> Environment: verified for Spark 2.1.1 and 2.2.0 in an sbt build
> Reporter: Tomasz Bartczak
> Assignee: Liang-Chi Hsieh
> Fix For: 2.3.0
>
>
> Returning from a map a tuple that contains another tuple, defined as a type alias, results in an error.
> Minimal reproducible case:
> Given a structure like this:
> {code}
> object C {
>   type TwoInt = (Int,Int)
>   def tupleTypeAlias: TwoInt = (1,1)
> }
> {code}
> when I do:
> {code}
> Seq(1).toDS().map(_ => ("",C.tupleTypeAlias))
> {code}
> I get this exception:
> {code}
> type T1 is not a class
> scala.ScalaReflectionException: type T1 is not a class
> at scala.reflect.api.Symbols$SymbolApi$class.asClass(Symbols.scala:275)
> at scala.reflect.internal.Symbols$SymbolContextApiImpl.asClass(Symbols.scala:84)
> at org.apache.spark.sql.catalyst.ScalaReflection$.getClassFromType(ScalaReflection.scala:682)
> at org.apache.spark.sql.catalyst.ScalaReflection$.org$apache$spark$sql$catalyst$ScalaReflection$$dataTypeFor(ScalaReflection.scala:84)
> at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$10.apply(ScalaReflection.scala:614)
> at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$10.apply(ScalaReflection.scala:607)
> at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
> at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
> at scala.collection.immutable.List.foreach(List.scala:381)
> at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
> at scala.collection.immutable.List.flatMap(List.scala:344)
> at org.apache.spark.sql.catalyst.ScalaReflection$.org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor(ScalaReflection.scala:607)
> at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$10.apply(ScalaReflection.scala:619)
> at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$10.apply(ScalaReflection.scala:607)
> at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
> at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
> at scala.collection.immutable.List.foreach(List.scala:381)
> at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
> at scala.collection.immutable.List.flatMap(List.scala:344)
> at org.apache.spark.sql.catalyst.ScalaReflection$.org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor(ScalaReflection.scala:607)
> at org.apache.spark.sql.catalyst.ScalaReflection$.serializerFor(ScalaReflection.scala:438)
> at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:71)
> at org.apache.spark.sql.Encoders$.product(Encoders.scala:275)
> at org.apache.spark.sql.LowPrioritySQLImplicits$class.newProductEncoder(SQLImplicits.scala:233)
> at org.apache.spark.sql.SQLImplicits.newProductEncoder(SQLImplicits.scala:33)
> {code}
> In Spark 2.1.1 the last exception was 'head of an empty list'.
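> A possible workaround until the fix lands (a sketch only, not part of the original report and not verified here): ascribe the dealiased tuple type explicitly so the implicitly derived encoder never sees the alias.
> {code}
> // Hypothetical workaround: spell out (Int, Int) instead of the C.TwoInt alias
> // so the Encoder is derived for (String, (Int, Int)) rather than a type
> // containing the alias. Assumes the usual spark.implicits._ are in scope.
> Seq(1).toDS().map(_ => (("", C.tupleTypeAlias): (String, (Int, Int))))
> {code}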
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org