Posted to issues@spark.apache.org by "Aaron Davidson (JIRA)" <ji...@apache.org> on 2014/05/12 03:19:14 UTC

[jira] [Resolved] (SPARK-1796) spark-submit does not set driver memory correctly

     [ https://issues.apache.org/jira/browse/SPARK-1796?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aaron Davidson resolved SPARK-1796.
-----------------------------------

    Resolution: Fixed

> spark-submit does not set driver memory correctly
> -------------------------------------------------
>
>                 Key: SPARK-1796
>                 URL: https://issues.apache.org/jira/browse/SPARK-1796
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core, YARN
>            Reporter: Patrick Wendell
>            Assignee: Patrick Wendell
>            Priority: Blocker
>             Fix For: 1.0.0
>
>
> This is not working correctly at present. Driver memory should be taken from SPARK_DRIVER_MEM unless --deploy-mode is explicitly set to "cluster".
> {code}
> patrick@patrick-t430s:~/Documents/spark$ SPARK_PRINT_LAUNCH_COMMAND=1 ./bin/spark-shell --driver-memory 2g
> Spark Command: /usr/lib/jvm/jdk1.7.0_25/bin/java -cp ::/home/patrick/Documents/spark/conf:/home/patrick/Documents/spark/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.0.4.jar -Xms512m -Xmx512m org.apache.spark.deploy.SparkSubmit spark-internal --driver-memory 2g --class org.apache.spark.repl.Main
> ========================================
> {code}
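
For context, the expected behavior described in the issue can be sketched as follows. This is a minimal, hypothetical Scala sketch and not the actual SparkSubmit code; the names DriverMemorySketch and driverJavaOpts are made up for illustration. It assumes the rule stated above: in client mode the requested driver memory should become the driver JVM's -Xms/-Xmx flags, while in cluster mode the driver runs remotely and sizing is left to the cluster manager.

{code}
// Hypothetical sketch (not the actual SparkSubmit implementation): translate
// --deploy-mode and --driver-memory into heap flags for the driver JVM.
object DriverMemorySketch {
  def driverJavaOpts(deployMode: Option[String], driverMemory: Option[String]): Seq[String] = {
    if (deployMode.contains("cluster")) {
      // In cluster mode the driver is launched remotely, so the cluster
      // manager (e.g. YARN) is responsible for sizing the driver.
      Seq.empty
    } else {
      // In client mode the driver is the locally spawned JVM, so the
      // requested memory must be applied as -Xms/-Xmx here.
      val mem = driverMemory.getOrElse("512m")
      Seq(s"-Xms$mem", s"-Xmx$mem")
    }
  }

  def main(args: Array[String]): Unit = {
    // Mirrors the report: --driver-memory 2g with no --deploy-mode given.
    println(driverJavaOpts(deployMode = None, driverMemory = Some("2g")).mkString(" "))
  }
}
{code}

Run with driverMemory = Some("2g") and no deploy mode, the sketch prints "-Xms2g -Xmx2g", which is what the launch command above should have contained instead of the -Xms512m -Xmx512m defaults.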


