Posted to issues@spark.apache.org by "Xiangrui Meng (JIRA)" <ji...@apache.org> on 2014/08/15 17:44:18 UTC

[jira] [Created] (SPARK-3071) Increase default driver memory

Xiangrui Meng created SPARK-3071:
------------------------------------

             Summary: Increase default driver memory
                 Key: SPARK-3071
                 URL: https://issues.apache.org/jira/browse/SPARK-3071
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
            Reporter: Xiangrui Meng


The current default is 512M, which is usually too small because users also run computation on the driver. In local mode, the executor memory setting is ignored and only the driver memory is used, which is an additional incentive to increase the default driver memory.
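For context (not part of the proposal itself), a minimal sketch of the local-mode situation described above; the 2g and 4g figures are illustrative, and raising the driver memory has to happen before the driver JVM starts, e.g. via spark-submit --driver-memory 2g or spark.driver.memory in spark-defaults.conf, not in application code.

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

object LocalModeMemory {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[*]")               // local mode: everything runs in the driver JVM
      .setAppName("LocalModeMemory")
      .set("spark.executor.memory", "4g")  // ignored in local mode, per the description above

    val sc = new SparkContext(conf)
    // Defaults to 512m today, which is what this issue proposes to raise.
    println("driver memory: " + sc.getConf.get("spark.driver.memory", "512m"))
    sc.stop()
  }
}
{code}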

I suggest:

1. 2GB in local mode, and warn users if executor memory is set to a larger value (see the sketch after this list)
2. the same as the worker memory on an EC2 standalone cluster
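Purely as an illustration of the warning in item 1 (not a patch, and the helper names below are hypothetical, not Spark internals), the check could compare the two settings when the master is local:

{code:scala}
import org.apache.spark.SparkConf

object DriverMemoryCheck {
  // Parse strings like "512m" or "2g" into megabytes (simplified, illustrative).
  private def toMb(s: String): Long = {
    val lower = s.trim.toLowerCase
    if (lower.endsWith("g")) lower.dropRight(1).toLong * 1024
    else if (lower.endsWith("m")) lower.dropRight(1).toLong
    else lower.toLong / (1024 * 1024) // assume plain bytes
  }

  // Warn when running in local mode with executor memory set higher than driver
  // memory, since the executor setting is ignored there.
  def warnIfExecutorMemoryIgnored(conf: SparkConf): Unit = {
    val isLocal = conf.get("spark.master", "").startsWith("local")
    val driverMb = toMb(conf.get("spark.driver.memory", "512m"))
    val executorMb = toMb(conf.get("spark.executor.memory", "512m"))
    if (isLocal && executorMb > driverMb) {
      System.err.println(
        s"WARN: spark.executor.memory (${executorMb}m) is ignored in local mode; " +
        s"only spark.driver.memory (${driverMb}m) is used.")
    }
  }
}
{code}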



--
This message was sent by Atlassian JIRA
(v6.2#6252)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org