Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/06/13 04:18:21 UTC

[jira] [Assigned] (SPARK-15910) Schema is not checked when converting DataFrame to Dataset using Kryo encoder

     [ https://issues.apache.org/jira/browse/SPARK-15910?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-15910:
------------------------------------

    Assignee: Apache Spark

> Schema is not checked when converting DataFrame to Dataset using Kryo encoder
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-15910
>                 URL: https://issues.apache.org/jira/browse/SPARK-15910
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Sean Zhong
>            Assignee: Apache Spark
>
> Here is a case that reproduces it:
> {code}
> scala> import org.apache.spark.sql.Encoders._
> scala> import org.apache.spark.sql.Encoders
> scala> import org.apache.spark.sql.Encoder
> scala> case class B(b: Int)
> scala> implicit val encoder = Encoders.kryo[B]
> encoder: org.apache.spark.sql.Encoder[B] = class[value[0]: binary]
> scala> val ds = Seq((1)).toDF("b").as[B].map(identity)
> ds: org.apache.spark.sql.Dataset[B] = [value: binary]
> scala> ds.show()
> 16/06/10 13:46:51 ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 45, Column 168: No applicable constructor/method found for actual parameters "int"; candidates are: "public static java.nio.ByteBuffer java.nio.ByteBuffer.wrap(byte[])", "public static java.nio.ByteBuffer java.nio.ByteBuffer.wrap(byte[], int, int)"
> ...
> {code}
> The expected behavior is to report the schema mismatch earlier, at the point where the Dataset is created with {code}dataFrame.as[B]{code}, instead of failing at execution time.
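> For illustration, here is a minimal standalone sketch of the two schemas that disagree (the object name {code}KryoSchemaMismatch{code} is hypothetical; the schemas it prints match the REPL output above):
> {code}
> import org.apache.spark.sql.{Encoders, SparkSession}
>
> case class B(b: Int)
>
> // Hypothetical standalone repro, for illustration only.
> object KryoSchemaMismatch {
>   def main(args: Array[String]): Unit = {
>     val spark = SparkSession.builder().master("local[*]").appName("SPARK-15910").getOrCreate()
>     import spark.implicits._
>
>     // The Kryo encoder serializes the whole object into a single binary column.
>     println(Encoders.kryo[B].schema)   // value: binary
>
>     // The DataFrame built from Seq(1) has a single int column named "b".
>     println(Seq(1).toDF("b").schema)   // b: int
>
>     // Because the schemas disagree, the generated deserializer ends up calling
>     // ByteBuffer.wrap on an int, which is the codegen failure shown above.
>     // Comparing these two schemas eagerly in .as[B] would surface the error
>     // when the Dataset is created rather than when it is executed.
>     spark.stop()
>   }
> }
> {code}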



