Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2017/07/08 05:59:00 UTC
[jira] [Resolved] (SPARK-21337) SQL which has large ‘case when’ expressions may cause code generation beyond 64KB
[ https://issues.apache.org/jira/browse/SPARK-21337?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-21337.
----------------------------------
Resolution: Cannot Reproduce
I am resolving this per the comment above, as I could not reproduce it against master with the reproducer above. If the fix was just an improvement without any behaviour change, I believe we should backport it after identifying the JIRA that fixed this.
> SQL which has large ‘case when’ expressions may cause code generation beyond 64KB
> ---------------------------------------------------------------------------------
>
> Key: SPARK-21337
> URL: https://issues.apache.org/jira/browse/SPARK-21337
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.1.1
> Environment: spark-2.1.1-hadoop-2.6.0-cdh-5.4.2
> Reporter: fengchaoge
> Fix For: 2.1.1
>
>
> When there are large 'case when' expressions in Spark SQL, the CodeGenerator fails to compile them.
> The error message is followed by a huge dump of generated source code, after which compilation fails:
> java.util.concurrent.ExecutionException: java.lang.Exception: failed to compile: org.codehaus.janino.JaninoRuntimeException: Code of method "apply_9$(Lorg/apache/spark/sql/catalyst/expressions/GeneratedClass$SpecificUnsafeProjection;Lorg/apache/spark/sql/catalyst/InternalRow;)V" of class "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection" grows beyond 64 KB.
> It seems that SPARK-13242 (https://issues.apache.org/jira/browse/SPARK-13242) solved this problem in Spark 1.6.2; however, it appears again in Spark 2.1.1.
> Is there something wrong?
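The kind of query the reporter describes can be sketched without Spark itself: a SELECT whose CASE expression has thousands of WHEN arms. The column name `c1`, table name `t`, and branch count below are illustrative assumptions, not taken from the report; on affected versions, whole-stage codegen for such a projection can emit a Java method exceeding the JVM's 64 KB per-method bytecode limit, producing the JaninoRuntimeException shown above.

```python
# Sketch of a SPARK-21337-style reproducer: build a query whose single CASE
# expression has many branches. Names `c1`/`t` and the branch count are
# assumptions for illustration.

def big_case_when(column: str, branches: int) -> str:
    """Return a Spark SQL query whose CASE expression has `branches` arms."""
    whens = " ".join(f"WHEN {column} = {i} THEN {i}" for i in range(branches))
    return f"SELECT CASE {whens} ELSE -1 END AS bucket FROM t"

query = big_case_when("c1", 3000)
# With a SparkSession available, spark.sql(query) against a table `t` is the
# shape of statement that triggers the oversized generated apply_N method.
```

The SQL string itself is small; it is the generated Java projection code, which expands each WHEN arm into inline branching logic, that grows past 64 KB.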
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org