Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2015/08/31 21:28:45 UTC

[jira] [Comment Edited] (SPARK-10375) Setting the driver memory with SparkConf().set("spark.driver.memory","1g") does not work

    [ https://issues.apache.org/jira/browse/SPARK-10375?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14723902#comment-14723902 ] 

Marcelo Vanzin edited comment on SPARK-10375 at 8/31/15 7:27 PM:
-----------------------------------------------------------------

You can't set the driver memory after the driver has already started. If you want to set it, you need to do so either in your config file or on the spark-submit command line ({{--driver-memory 1g}} or {{--conf spark.driver.memory=1g}}).

The UI discrepancy is unfortunate, but not easy (nor important enough) to fix, at least at the moment. It affects quite a few properties that can't really be changed after the context is initialized.
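For reference, a minimal sketch of the suggested approaches (the application file name below is just a placeholder):

{noformat}
# On the spark-submit command line (either form):
spark-submit --driver-memory 1g your_app.py
spark-submit --conf spark.driver.memory=1g your_app.py

# Or in conf/spark-defaults.conf:
spark.driver.memory    1g
{noformat}

Either way the value is read by the launcher before the driver JVM is created, which is why it takes effect there but not when set from SparkConf inside the application.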



> Setting the driver memory with SparkConf().set("spark.driver.memory","1g") does not work
> ----------------------------------------------------------------------------------------
>
>                 Key: SPARK-10375
>                 URL: https://issues.apache.org/jira/browse/SPARK-10375
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.3.0
>            Environment: Running with YARN
>            Reporter: Thomas
>            Priority: Minor
>
> When running PySpark 1.3.0 with YARN, the following code has no effect:
> pyspark.SparkConf().set("spark.driver.memory","1g")
> The Environment tab in YARN shows that the driver has 1g; however, the Executors tab only shows 512 MB (the default value) for the driver memory. This issue goes away when the driver memory is specified on the command line (i.e. --driver-memory 1g).


