Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/01/16 13:33:26 UTC

[jira] [Assigned] (SPARK-19244) Sort MemoryConsumers according to their memory usage when spilling

     [ https://issues.apache.org/jira/browse/SPARK-19244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-19244:
------------------------------------

    Assignee:     (was: Apache Spark)

> Sort MemoryConsumers according to their memory usage when spilling
> ------------------------------------------------------------------
>
>                 Key: SPARK-19244
>                 URL: https://issues.apache.org/jira/browse/SPARK-19244
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Liang-Chi Hsieh
>
> In `TaskMemoryManager`, when we acquire memory by calling `acquireExecutionMemory` and cannot obtain the required amount, we try to spill other memory consumers.
> Currently, we simply iterate over the memory consumers held in a hash set, so the consumers are normally visited in the same order each time.
> The first issue is that we might spill more consumers than necessary. For example, suppose consumer 1 uses 10MB and consumer 2 uses 50MB, and consumer 3 then tries to acquire 100MB but only 60MB is available, so spilling is needed. We might spill both consumer 1 and consumer 2, even though spilling only consumer 2 would free enough memory to satisfy the 100MB request.
> The second issue is that if consumer 1 is spilled during the first round of spilling and later holds only 5MB, a subsequent allocation by consumer 4 may trigger spilling again. Because we iterate over the memory consumers in the same order, consumer 1 is spilled once more, so consumer 1 ends up producing many small spill files. A sketch of the proposed size-aware selection appears after this description.
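
A minimal, hypothetical Java sketch of the idea (not Spark's actual TaskMemoryManager code): consumers are sorted by their current memory usage, and we prefer to spill the single smallest consumer whose usage covers the shortfall before falling back to the largest ones. The class and method names (SimpleConsumer, spillToFree) are illustrative only.

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

class SimpleConsumer {
    final String name;
    long memoryUsed;   // bytes currently held by this consumer

    SimpleConsumer(String name, long memoryUsed) {
        this.name = name;
        this.memoryUsed = memoryUsed;
    }

    // Pretend to spill everything this consumer holds and report how much was freed.
    long spill() {
        long released = memoryUsed;
        memoryUsed = 0;
        return released;
    }
}

public class SpillBySizeSketch {

    // Spill consumers until `needed` bytes are freed, preferring the single
    // smallest consumer that can cover the shortfall on its own, so small
    // consumers are not spilled unnecessarily.
    static long spillToFree(List<SimpleConsumer> consumers, long needed) {
        List<SimpleConsumer> sorted = new ArrayList<>(consumers);
        sorted.sort(Comparator.comparingLong((SimpleConsumer c) -> c.memoryUsed)); // ascending by usage

        // First pass: the smallest consumer whose usage alone covers the shortfall.
        for (SimpleConsumer c : sorted) {
            if (c.memoryUsed >= needed) {
                return c.spill();
            }
        }

        // Otherwise spill from the largest consumer down until enough is released.
        long freed = 0;
        for (int i = sorted.size() - 1; i >= 0 && freed < needed; i--) {
            freed += sorted.get(i).spill();
        }
        return freed;
    }

    public static void main(String[] args) {
        List<SimpleConsumer> consumers = new ArrayList<>();
        consumers.add(new SimpleConsumer("consumer1", 10L << 20)); // 10MB
        consumers.add(new SimpleConsumer("consumer2", 50L << 20)); // 50MB
        // consumer3 wants 100MB but only 60MB are free: the shortfall is 40MB,
        // so spilling consumer2 alone is enough and consumer1 stays intact.
        long shortfall = 40L << 20;
        System.out.println("freed = " + spillToFree(consumers, shortfall) + " bytes");
    }
}

With this ordering, the example from the description spills only consumer 2, and a consumer that was recently spilled (and therefore holds little memory) is unlikely to be picked again, which avoids producing many small spill files.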



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org