Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/06/30 21:27:06 UTC

[jira] [Commented] (SPARK-3071) Increase default driver memory

    [ https://issues.apache.org/jira/browse/SPARK-3071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14608912#comment-14608912 ] 

Apache Spark commented on SPARK-3071:
-------------------------------------

User 'ilganeli' has created a pull request for this issue:
https://github.com/apache/spark/pull/7132

> Increase default driver memory
> ------------------------------
>
>                 Key: SPARK-3071
>                 URL: https://issues.apache.org/jira/browse/SPARK-3071
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Xiangrui Meng
>
> The current default is 512M, which is usually too small because users also use the driver to do some computation. In local mode, the executor memory setting is ignored and only the driver memory is used, which provides even more incentive to increase the default driver memory.
> I suggest:
> 1. 2GB in local mode, with a warning to users if executor memory is set to a bigger value
> 2. the same as worker memory on an EC2 standalone server
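
For context, a minimal sketch of how the driver-memory setting is applied today (property and flag names per the Spark configuration docs; the app class and jar names below are hypothetical). Note that spark.driver.memory only takes effect at JVM launch, so in client or local mode it must be passed via spark-submit or spark-defaults.conf rather than set from application code:

    // Overriding the 512M default at launch time:
    //   spark-submit --driver-memory 2g --class com.example.App app.jar
    // Setting the property from inside the application has no effect on an
    // already-running driver JVM; the SparkConf call below is shown only to
    // illustrate where the property lives.
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("DriverMemoryExample")
      .set("spark.driver.memory", "2g")

    val sc = new SparkContext(conf)
    // Falls back to the old 512m default if nothing was set at launch:
    println(sc.getConf.get("spark.driver.memory", "512m"))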



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org