Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/02/27 19:40:45 UTC

[jira] [Assigned] (SPARK-19372) Code generation for Filter predicate including many OR conditions exceeds JVM method size limit

     [ https://issues.apache.org/jira/browse/SPARK-19372?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-19372:
------------------------------------

    Assignee:     (was: Apache Spark)

> Code generation for Filter predicate including many OR conditions exceeds JVM method size limit 
> ------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-19372
>                 URL: https://issues.apache.org/jira/browse/SPARK-19372
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.1.0
>            Reporter: Jay Pranavamurthi
>         Attachments: wide400cols.csv
>
>
> For the attached csv file, the code below fails with the exception "org.codehaus.janino.JaninoRuntimeException: Code of method "(Lorg/apache/spark/sql/catalyst/InternalRow;)Z" of class "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificPredicate" grows beyond 64 KB". The JVM caps the bytecode of a single method at 64 KB, and the predicate generated for 400 OR'ed comparisons is inlined into one method that exceeds that cap.
> Code:
> {code:borderStyle=solid}
>   import org.apache.spark.SparkConf
>   import org.apache.spark.sql.SparkSession
>   import org.apache.spark.sql.functions.lit
>
>   val conf = new SparkConf().setMaster("local[1]")
>   val sqlContext = SparkSession.builder().config(conf).getOrCreate().sqlContext
>   val dataframe =
>     sqlContext
>       .read
>       .format("com.databricks.spark.csv")
>       .load("wide400cols.csv")
>   // Build one predicate OR-ing a not-equal test across all 400 columns
>   val filter = (0 to 399)
>     .foldLeft(lit(false))((e, index) => e.or(dataframe.col(dataframe.columns(index)) =!= s"column${index + 1}"))
>   val filtered = dataframe.filter(filter)
>   filtered.show(100)
> {code}
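>
> A possible workaround (a sketch, not from this ticket; note the null semantics differ slightly, since Scala's != treats a null cell as unequal while the Column =!= yields null): filter with a typed lambda, which runs as a plain Scala closure per row, so Catalyst never generates the single oversized eval method.
> {code:borderStyle=solid}
>   import org.apache.spark.sql.Row
>
>   // Typed filter: the 400 disjuncts are evaluated inside this closure
>   // instead of being inlined into one generated Java method.
>   val filtered2 = dataframe.filter { row: Row =>
>     (0 until 400).exists(i => row.getString(i) != s"column${i + 1}")
>   }
>   filtered2.show(100)
> {code}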



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org