Posted to issues@spark.apache.org by "Frederik Schreiber (Jira)" <ji...@apache.org> on 2020/02/03 08:27:00 UTC

[jira] [Commented] (SPARK-22510) Exceptions caused by 64KB JVM bytecode or 64K constant pool entry limit

    [ https://issues.apache.org/jira/browse/SPARK-22510?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17028763#comment-17028763 ] 

Frederik Schreiber commented on SPARK-22510:
--------------------------------------------

Hi [~smilegator], [~kiszk]

we are using Spark 2.4.0 and are currently running into the 64KB exception. Our DataFrame has only about 42 columns, so we are surprised, since all of the related bugs are marked as closed. Are there still known bugs that lead to this exception? Can the exception appear on complex queries/DataFrames by design?

spark.sql.codegen.maxFields is set to 100

 

Are there any suggestions for avoiding this error?

 

We also tried Spark version 2.4.4, with the same results.
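
For reference, here is roughly how we configure the codegen-related options (a minimal Scala sketch; the app name is just illustrative, and the spark.sql.codegen.wholeStage setting is included only as a workaround we are considering, not something we have verified helps):

{code:scala}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("codegen-64kb-repro") // illustrative name
  .master("local[*]")
  // Whole-stage codegen is skipped for plans with more than this many fields.
  .config("spark.sql.codegen.maxFields", "100")
  // Possible workaround: disable whole-stage codegen so Spark falls back to
  // the interpreted/iterator-based execution path instead of generating one
  // large Java method per stage.
  .config("spark.sql.codegen.wholeStage", "false")
  .getOrCreate()
{code}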

> Exceptions caused by 64KB JVM bytecode or 64K constant pool entry limit 
> ------------------------------------------------------------------------
>
>                 Key: SPARK-22510
>                 URL: https://issues.apache.org/jira/browse/SPARK-22510
>             Project: Spark
>          Issue Type: Umbrella
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Xiao Li
>            Assignee: Kazuaki Ishizaki
>            Priority: Major
>              Labels: bulk-closed, releasenotes
>
> Codegen can throw an exception due to the 64KB JVM bytecode or 64K constant pool entry limit.


