Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/07/02 22:01:05 UTC

[jira] [Commented] (SPARK-8776) Increase the default MaxPermSize

    [ https://issues.apache.org/jira/browse/SPARK-8776?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14612436#comment-14612436 ] 

Apache Spark commented on SPARK-8776:
-------------------------------------

User 'yhuai' has created a pull request for this issue:
https://github.com/apache/spark/pull/7196

> Increase the default MaxPermSize
> --------------------------------
>
>                 Key: SPARK-8776
>                 URL: https://issues.apache.org/jira/browse/SPARK-8776
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Yin Huai
>
> Since 1.4.0, Spark SQL has used isolated class loaders to separate the Hive dependencies for the metastore and for execution, which increases PermGen memory consumption. How about we increase the default size from 128m to 256m? It seems the change we need to make is at https://github.com/apache/spark/blob/3c0156899dc1ec1f7dfe6d7c8af47fa6dc7d00bf/launcher/src/main/java/org/apache/spark/launcher/AbstractCommandBuilder.java#L139. 
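[Editor's note: a minimal Java sketch of the kind of change proposed above. This is not Spark's actual launcher code; the class and method names here (PermSizeDefault, addPermGenSizeOpt) are illustrative. The idea is that the launcher appends a default -XX:MaxPermSize flag to the JVM command line only when the user has not already set one, so raising the default to 256m never overrides an explicit setting.]

```java
import java.util.ArrayList;
import java.util.List;

public class PermSizeDefault {

    // Hypothetical helper mirroring the proposal: bump the default
    // PermGen cap to 256m unless the caller has already specified one.
    static List<String> addPermGenSizeOpt(List<String> cmd) {
        for (String arg : cmd) {
            if (arg.startsWith("-XX:MaxPermSize=")) {
                return cmd; // user already chose a size; leave it alone
            }
        }
        List<String> withDefault = new ArrayList<>(cmd);
        withDefault.add("-XX:MaxPermSize=256m");
        return withDefault;
    }

    public static void main(String[] args) {
        List<String> cmd = new ArrayList<>();
        cmd.add("-Xmx1g");
        // No explicit MaxPermSize given, so the 256m default is appended.
        System.out.println(addPermGenSizeOpt(cmd));
    }
}
```

[Note that -XX:MaxPermSize only applies on Java 7 and earlier; Java 8 removed PermGen, and the flag is ignored with a warning there.]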



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org