Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2019/01/04 21:33:00 UTC

[jira] [Updated] (SPARK-26539) Remove spark.memory.useLegacyMode memory settings + StaticMemoryManager

     [ https://issues.apache.org/jira/browse/SPARK-26539?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-26539:
------------------------------
    Docs Text: The legacy memory manager, enabled by spark.memory.useLegacyMode, has been removed in Spark 3. The setting is now effectively always false, which has been its default since Spark 1.6. The associated settings spark.shuffle.memoryFraction, spark.storage.memoryFraction, and spark.storage.unrollFraction have been removed as well.
       Labels: release-notes  (was: )

I should say this is still open to discussion, but a quick dev@ thread about it didn't surface any strong objections.
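
For anyone updating configs, a minimal sketch of what the migration looks like in spark-defaults.conf. The unified-mode settings spark.memory.fraction and spark.memory.storageFraction are the existing replacements; the values shown are the documented defaults, for illustration only, not tuning recommendations.

```properties
# Removed in Spark 3 (only read when spark.memory.useLegacyMode=true):
# spark.memory.useLegacyMode     true
# spark.shuffle.memoryFraction   0.2
# spark.storage.memoryFraction   0.6
# spark.storage.unrollFraction   0.2

# UnifiedMemoryManager settings that remain (defaults shown):
spark.memory.fraction            0.6
spark.memory.storageFraction     0.5
```

Under the unified manager, execution and storage share a single region sized by spark.memory.fraction, so the separate per-purpose fractions above have no equivalent and should simply be dropped.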

> Remove spark.memory.useLegacyMode memory settings + StaticMemoryManager
> -----------------------------------------------------------------------
>
>                 Key: SPARK-26539
>                 URL: https://issues.apache.org/jira/browse/SPARK-26539
>             Project: Spark
>          Issue Type: Task
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Sean Owen
>            Assignee: Sean Owen
>            Priority: Major
>              Labels: release-notes
>
> The old memory manager was superseded by UnifiedMemoryManager in Spark 1.6, which has been the default ever since. I think we could remove the legacy manager to simplify the code a little, but more importantly to reduce the variety of memory settings users are confronted with when using Spark.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org