Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/06/11 09:42:01 UTC

[jira] [Resolved] (SPARK-8289) Provide a specific stack size with all Java implementations to prevent stack overflows with certain tests

     [ https://issues.apache.org/jira/browse/SPARK-8289?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-8289.
------------------------------
       Resolution: Fixed
    Fix Version/s: 1.4.1

Issue resolved by pull request 6727
[https://github.com/apache/spark/pull/6727]

> Provide a specific stack size with all Java implementations to prevent stack overflows with certain tests
> ---------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-8289
>                 URL: https://issues.apache.org/jira/browse/SPARK-8289
>             Project: Spark
>          Issue Type: Bug
>          Components: Tests
>    Affects Versions: 1.5.0
>         Environment: Any environment where the Java vendor is not OpenJDK
>            Reporter: Adam Roberts
>             Fix For: 1.5.0, 1.4.1
>
>
> Default stack sizes differ across Java implementations, so tests can pass on JVMs with larger default stack sizes (OpenJDK) but fail on Oracle or IBM Java, whose defaults are smaller. In particular, JavaALSSuite with 15 iterations throws StackOverflowError on Oracle and IBM Java but not on OpenJDK. This JIRA addresses the problem by specifying an explicit stack size of 4096k for all Java distributions, in both the SBT test args and the Maven test args (changing project/SparkBuild.scala and pom.xml respectively).
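
For illustration, a minimal sketch of how an explicit test stack size is typically wired into an sbt build of that era; fork and javaOptions are standard sbt keys, but the setting name and the exact change made in pull request 6727 may differ:

    // Hypothetical excerpt for project/SparkBuild.scala (sbt 0.13 syntax).
    // Tests run in a forked JVM so the -Xss flag actually applies, and every
    // test JVM gets a fixed 4096k stack regardless of the JVM vendor's default.
    lazy val testJvmSettings = Seq(
      fork in Test := true,
      javaOptions in Test += "-Xss4096k"
    )

On the Maven side, the equivalent would be appending -Xss4096k to the JVM arguments pom.xml passes to the test plugin (e.g. its argLine), so Oracle and IBM JVMs get the same stack headroom as OpenJDK.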



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org