Posted to issues@spark.apache.org by "Richard Marscher (JIRA)" <ji...@apache.org> on 2016/06/16 15:25:06 UTC

[jira] [Issue Comment Deleted] (SPARK-15786) joinWith bytecode generation calling ByteBuffer.wrap with InternalRow

     [ https://issues.apache.org/jira/browse/SPARK-15786?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Richard Marscher updated SPARK-15786:
-------------------------------------
    Comment: was deleted

(was: Not sure I completely understand your last point. If those lines are removed, then there are no encoders available for Option. Does the Kryo encoder have special restrictions that first-class encoders don't? Or is this more about not being able to "cast" at all with the as syntax?)

> joinWith bytecode generation calling ByteBuffer.wrap with InternalRow
> ---------------------------------------------------------------------
>
>                 Key: SPARK-15786
>                 URL: https://issues.apache.org/jira/browse/SPARK-15786
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.1, 2.0.0
>            Reporter: Richard Marscher
>            Assignee: Sean Zhong
>             Fix For: 2.0.0
>
>
> {code}java.lang.RuntimeException: Error while decoding: java.util.concurrent.ExecutionException: java.lang.Exception: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 36, Column 107: No applicable constructor/method found for actual parameters "org.apache.spark.sql.catalyst.InternalRow"; candidates are: "public static java.nio.ByteBuffer java.nio.ByteBuffer.wrap(byte[])", "public static java.nio.ByteBuffer java.nio.ByteBuffer.wrap(byte[], int, int)"{code}
> I have been trying to use joinWith together with Option data types to approximate the RDD outer-join semantics on Dataset, giving a nicer API for Scala. However, using the Dataset.as[] syntax leads to bytecode generation that passes an InternalRow object into ByteBuffer.wrap, which expects a byte[] (optionally with int offset and length arguments).
> I have a notebook reproducing this against 2.0 preview in Databricks Community Edition: https://databricks-prod-cloudfront.cloud.databricks.com/public/4027ec902e239c93eaaa8714f173bcfc/160347920874755/1039589581260901/673639177603143/latest.html
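For context, the RDD outer-join semantics the reporter was trying to reproduce surface missing rows as None on the corresponding side. A minimal plain-Scala sketch of those semantics (no Spark dependency; the helper name and the unique-keys-per-side assumption are illustrative, not part of any Spark API):

```scala
// Plain-Scala sketch of RDD-style full outer join semantics:
// a key absent on either side yields None for that side.
// Assumes unique keys per side; fullOuterJoin is an illustrative name.
def fullOuterJoin[K, A, B](
    left: Seq[(K, A)],
    right: Seq[(K, B)]): Map[K, (Option[A], Option[B])] = {
  val l = left.toMap
  val r = right.toMap
  // Union of keys; Map.get returns Option, which directly
  // produces the (Option[A], Option[B]) pair per key.
  (l.keySet ++ r.keySet).map(k => k -> (l.get(k), r.get(k))).toMap
}
```

The bug above arises when trying to express this same shape through Dataset.joinWith with Option-typed results, where the generated decoder code mishandles the InternalRow.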



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org