Posted to issues@spark.apache.org by "Sean Zhong (JIRA)" <ji...@apache.org> on 2016/06/13 04:02:21 UTC

[jira] [Created] (SPARK-15910) Schema is not checked when converting DataFrame to Dataset using Kryo encoder

Sean Zhong created SPARK-15910:
----------------------------------

             Summary: Schema is not checked when converting DataFrame to Dataset using Kryo encoder
                 Key: SPARK-15910
                 URL: https://issues.apache.org/jira/browse/SPARK-15910
             Project: Spark
          Issue Type: Bug
          Components: SQL
            Reporter: Sean Zhong


Here is a case that reproduces the issue:

{code}
scala> import org.apache.spark.sql.Encoders._
scala> import org.apache.spark.sql.Encoders
scala> import org.apache.spark.sql.Encoder

scala> case class B(b: Int)

scala> implicit val encoder = Encoders.kryo[B]
encoder: org.apache.spark.sql.Encoder[B] = class[value[0]: binary]

scala> val ds = Seq((1)).toDF("b").as[B].map(identity)
ds: org.apache.spark.sql.Dataset[B] = [value: binary]

scala> ds.show()
16/06/10 13:46:51 ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 45, Column 168: No applicable constructor/method found for actual parameters "int"; candidates are: "public static java.nio.ByteBuffer java.nio.ByteBuffer.wrap(byte[])", "public static java.nio.ByteBuffer java.nio.ByteBuffer.wrap(byte[], int, int)"
...
{code}
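
For contrast, here is a sketch (not part of the original report, and assuming a spark-shell session so spark.implicits._ is already in scope) of conversions where the schema and the encoder do line up, and which therefore work:

{code}
// A sketch for contrast, assuming a spark-shell session where
// spark.implicits._ is already in scope.
import org.apache.spark.sql.{Encoder, Encoders}

case class B(b: Int)

// With the default product encoder the DataFrame schema (b: Int) matches
// the encoder's schema, so .as[B] behaves as expected.
val ds1 = Seq(1).toDF("b").as[B]          // Dataset[B] = [b: int]
ds1.map(_.b + 1).show()

// With the Kryo encoder, build the Dataset from objects so its schema is
// already the single binary column the encoder expects.
implicit val kryoEncoder: Encoder[B] = Encoders.kryo[B]
val ds2 = Seq(B(1)).toDS()                // Dataset[B] = [value: binary]
ds2.map(identity).show()
{code}

The failure in the report comes from the remaining combination: a DataFrame whose schema is (b: Int) converted with an encoder that expects a single binary column.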

The expected behavior is to report the schema mismatch earlier, when the Dataset is created via {code}dataFrame.as[B]{code}, instead of failing with a code generation error when an action is executed.
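
Until such a check exists, a user-side guard can fail fast by comparing the DataFrame schema with the target encoder's schema. This is only a sketch of a hypothetical helper (asChecked is not a Spark API), and an exact StructType comparison may be stricter than necessary (nullability, column order):

{code}
import org.apache.spark.sql.{DataFrame, Dataset, Encoder}

// Hypothetical helper, not part of Spark: refuse the conversion up front
// when the DataFrame's schema differs from what the encoder expects.
def asChecked[T](df: DataFrame)(implicit enc: Encoder[T]): Dataset[T] = {
  require(df.schema == enc.schema,
    s"Schema mismatch: DataFrame has ${df.schema.simpleString}, " +
    s"encoder expects ${enc.schema.simpleString}")
  df.as[T]
}

// With the Kryo encoder for B in scope, this fails immediately with a clear
// message instead of a code generation error at action time:
// asChecked[B](Seq(1).toDF("b"))
{code}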



