Posted to issues@spark.apache.org by "David Jung (JIRA)" <ji...@apache.org> on 2016/08/24 14:28:20 UTC

[jira] [Commented] (SPARK-16845) org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificOrdering" grows beyond 64 KB

    [ https://issues.apache.org/jira/browse/SPARK-16845?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15435016#comment-15435016 ] 

David Jung commented on SPARK-16845:
------------------------------------

In addition to receiving this error when attempting to call pyspark.ml.regression.RandomForestRegressor.fit() on a DataFrame with 700+ columns, we also see it when simply calling DataFrame.show() on the same wide DataFrame. Our modeling requires many thousands of features, so this is a blocking issue for MLlib for us.
I'll dig into it further and report back anything that sheds more light on it.
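
For reference, a minimal reproduction sketch of the two failing calls, assuming PySpark 2.0.0; the column count, row data, and all names below are illustrative, not taken from our actual pipeline:

    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.regression import RandomForestRegressor

    spark = SparkSession.builder.appName("spark-16845-repro").getOrCreate()

    # Build a wide DataFrame; 700+ columns matches the case described above.
    n_cols = 700
    names = ["c{}".format(i) for i in range(n_cols)]
    rows = [tuple(float(i) for i in range(n_cols))] * 10  # dummy numeric data
    df = spark.createDataFrame(rows, names)

    # Reportedly fails on 2.0.0 with the JaninoRuntimeException quoted below:
    df.show()

    # The ML path described above; "features" and "c0" are illustrative names:
    assembled = VectorAssembler(inputCols=names[1:], outputCol="features").transform(df)
    model = RandomForestRegressor(featuresCol="features", labelCol="c0").fit(assembled)

On an affected build, either call should raise the JaninoRuntimeException quoted in the issue description below.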


> org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificOrdering" grows beyond 64 KB
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16845
>                 URL: https://issues.apache.org/jira/browse/SPARK-16845
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API, ML, MLlib
>    Affects Versions: 2.0.0
>            Reporter: hejie
>
> I have a wide table (400 columns). When I try fitting the training data on all columns, the following fatal error occurs:
> 	... 46 more
> Caused by: org.codehaus.janino.JaninoRuntimeException: Code of method "(Lorg/apache/spark/sql/catalyst/InternalRow;Lorg/apache/spark/sql/catalyst/InternalRow;)I" of class "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificOrdering" grows beyond 64 KB
> 	at org.codehaus.janino.CodeContext.makeSpace(CodeContext.java:941)
> 	at org.codehaus.janino.CodeContext.write(CodeContext.java:854)



