Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/10/08 05:45:09 UTC
[jira] [Resolved] (SPARK-22868) 64KB JVM bytecode limit problem with aggregation
[ https://issues.apache.org/jira/browse/SPARK-22868?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-22868.
----------------------------------
Resolution: Incomplete
> 64KB JVM bytecode limit problem with aggregation
> ------------------------------------------------
>
> Key: SPARK-22868
> URL: https://issues.apache.org/jira/browse/SPARK-22868
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 2.2.1, 2.3.0
> Reporter: Kazuaki Ishizaki
> Priority: Major
> Labels: bulk-closed
>
> The following program can throw an exception because the generated code exceeds the 64KB JVM bytecode limit per method:
> {code}
> val df = spark.sparkContext.parallelize(
>   Seq((1.1, 2.2, 3.3, 4.4, 5.5, 6.6, 7.7, 8.8, 9.9, 10.0, 11.1, 12.2, 13.3, 14.4, 15.5, 16.6, 17.7, 18.8, 19.9, 20.0, 21.1, 22.2)),
>   1).toDF()
> df.agg(
>   kurtosis('_1), kurtosis('_2), kurtosis('_3), kurtosis('_4), kurtosis('_5),
>   kurtosis('_6), kurtosis('_7), kurtosis('_8), kurtosis('_9), kurtosis('_10),
>   kurtosis('_11), kurtosis('_12), kurtosis('_13), kurtosis('_14), kurtosis('_15)
> ).collect
> df.groupBy('_22)
>   .agg(
>     kurtosis('_1), kurtosis('_2), kurtosis('_3), kurtosis('_4), kurtosis('_5),
>     kurtosis('_6), kurtosis('_7), kurtosis('_8), kurtosis('_9), kurtosis('_10),
>     kurtosis('_11), kurtosis('_12), kurtosis('_13), kurtosis('_14), kurtosis('_15)
>   ).collect
> df.groupBy(
>   round('_1, 0), round('_2, 0), round('_3, 0), round('_4, 0), round('_5, 0),
>   round('_6, 0), round('_7, 0), round('_8, 0), round('_9, 0), round('_10, 0))
>   .agg(
>     kurtosis('_1), kurtosis('_2), kurtosis('_3), kurtosis('_4), kurtosis('_5),
>     kurtosis('_6), kurtosis('_7)
>   ).collect
> {code}
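A sketch of a possible mitigation for queries like those above (not the codegen fix this issue tracks): disabling whole-stage code generation makes Spark fall back to the interpreted iterator model, so the planner does not fuse all aggregate expressions into one oversized generated Java method. The configuration key is a real Spark SQL setting; the `spark` session name is assumed from the snippet above.

```scala
// Workaround sketch: turn off whole-stage codegen so aggregation is
// evaluated via the iterator model instead of one large generated
// method that can exceed the JVM's 64KB-per-method bytecode limit.
spark.conf.set("spark.sql.codegen.wholeStage", "false")
```

This trades away codegen speedups to avoid the compile failure. Spark 2.3+ also falls back automatically when a generated method grows past the threshold in spark.sql.codegen.hugeMethodLimit.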
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org