Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2019/02/14 00:30:00 UTC

[jira] [Assigned] (SPARK-25261) Standardize the default units of spark.driver|executor.memory

     [ https://issues.apache.org/jira/browse/SPARK-25261?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin reassigned SPARK-25261:
--------------------------------------

    Assignee: Marcelo Vanzin

> Standardize the default units of spark.driver|executor.memory
> -------------------------------------------------------------
>
>                 Key: SPARK-25261
>                 URL: https://issues.apache.org/jira/browse/SPARK-25261
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes, Spark Core, YARN
>    Affects Versions: 2.3.0
>            Reporter: huangtengfei
>            Assignee: Marcelo Vanzin
>            Priority: Minor
>
> From [SparkContext|https://github.com/ivoson/spark/blob/master/core/src/main/scala/org/apache/spark/SparkContext.scala#L464] and [SparkSubmitCommandBuilder|https://github.com/ivoson/spark/blob/master/launcher/src/main/java/org/apache/spark/launcher/SparkSubmitCommandBuilder.java#L265], we can see that spark.driver.memory and spark.executor.memory are parsed as bytes when no unit is specified. The documentation, however, describes them as being in MB by default, which may lead to misunderstanding.
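
To make the ambiguity concrete, below is a minimal, hypothetical Scala sketch, not Spark's actual parsing code (Spark goes through helpers such as JavaUtils.byteStringAsMb), showing how an unsuffixed value like "2048" diverges depending on whether bytes or MiB is assumed as the default unit:

    // Hypothetical sketch: interpret a Spark-style size string, falling back to
    // a default unit when no suffix is present (e.g. "2048" -> 2048 MiB if
    // defaultUnit = "m"). Not Spark's real implementation.
    object MemoryUnitSketch {
      def parseMiB(value: String, defaultUnit: String): Long = {
        // Append the default unit only when the value has no suffix.
        val withUnit = if (value.forall(_.isDigit)) value + defaultUnit else value
        val (num, unit) = withUnit.span(_.isDigit)
        // Small suffix table; Spark itself also supports k/g/t suffixes.
        val bytes = unit.toLowerCase match {
          case "b"        => num.toLong
          case "k" | "kb" => num.toLong * 1024L
          case "m" | "mb" => num.toLong * 1024L * 1024L
          case "g" | "gb" => num.toLong * 1024L * 1024L * 1024L
          case other      => throw new IllegalArgumentException(s"Unknown unit: $other")
        }
        bytes / (1024L * 1024L)
      }

      def main(args: Array[String]): Unit = {
        // The same string differs by a factor of 2^20 depending on the default:
        println(parseMiB("2048", "b"))  // 0 MiB  (2048 bytes rounds down)
        println(parseMiB("2048", "m"))  // 2048 MiB
      }
    }

The sketch illustrates why documenting one default (MB) while parsing another (bytes) matters: a user who sets spark.executor.memory=2048 expecting 2 GiB would, under a bytes default, effectively request almost no memory.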



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org