Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2015/07/02 08:12:06 UTC

[jira] [Closed] (SPARK-3071) Increase default driver memory

     [ https://issues.apache.org/jira/browse/SPARK-3071?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or closed SPARK-3071.
----------------------------
       Resolution: Fixed
    Fix Version/s: 1.5.0

> Increase default driver memory
> ------------------------------
>
>                 Key: SPARK-3071
>                 URL: https://issues.apache.org/jira/browse/SPARK-3071
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.4.2
>            Reporter: Xiangrui Meng
>            Assignee: Ilya Ganelin
>             Fix For: 1.5.0
>
>
> The current default is 512M, which is usually too small because the user also uses the driver to do some computation. In local mode, the executor memory setting is ignored and only driver memory is used, which provides more incentive to increase the default driver memory.
> I suggest:
> 1. 2GB in local mode, and warn users if executor memory is set to a bigger value
> 2. the same as worker memory on an EC2 standalone server
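For context, users affected by the small default can already raise driver memory themselves; a sketch using the standard Spark configuration property (the 4g value is illustrative, not a recommendation from this issue):

```
# At submit time, via the standard spark-submit flag:
#   spark-submit --driver-memory 4g --class <main-class> <app-jar>

# Or persistently, in conf/spark-defaults.conf:
spark.driver.memory  4g
```

Note that in local mode the driver and executor share one JVM, so spark.driver.memory is the setting that actually governs available heap there.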



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org