Posted to issues@spark.apache.org by "Wang Haihua (JIRA)" <ji...@apache.org> on 2017/09/23 14:21:00 UTC

[jira] [Commented] (SPARK-21157) Report Total Memory Used by Spark Executors

    [ https://issues.apache.org/jira/browse/SPARK-21157?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16177814#comment-16177814 ] 

Wang Haihua commented on SPARK-21157:
-------------------------------------

Does this include the RES memory of one executor?

It is useful to trace memory usage. We implemented a simple demo a year ago (collecting the RES memory usage by simulating the YARN container memory monitor and sending the metrics to the driver via heartbeat), because we needed to collect real memory usage for cluster statistics and application optimization.
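
For illustration, a minimal sketch of how such a probe can read the RES (VmRSS) value on Linux, similar in spirit to YARN's procfs-based container monitoring. The object and method names here are hypothetical, not from our actual demo, and a Linux /proc filesystem is assumed:

{code:scala}
import java.lang.management.ManagementFactory
import scala.io.Source

object ResMemoryProbe {
  // PID of the current JVM, parsed from the runtime MXBean name ("pid@host").
  private def selfPid: String =
    ManagementFactory.getRuntimeMXBean.getName.split("@")(0)

  // Resident set size (RES) of a process in kB, read from Linux procfs.
  // The relevant line in /proc/<pid>/status looks like: "VmRSS:    123456 kB"
  def residentMemoryKb(pid: String = selfPid): Option[Long] = {
    val source = Source.fromFile(s"/proc/$pid/status")
    try {
      source.getLines()
        .find(_.startsWith("VmRSS:"))
        .map(_.split("\\s+")(1).toLong)
    } finally {
      source.close()
    }
  }
}
{code}

In our demo the executor sampled this value periodically and piggybacked it on the existing executor-to-driver heartbeat.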

> Report Total Memory Used by Spark Executors
> -------------------------------------------
>
>                 Key: SPARK-21157
>                 URL: https://issues.apache.org/jira/browse/SPARK-21157
>             Project: Spark
>          Issue Type: Improvement
>          Components: Input/Output
>    Affects Versions: 2.1.1
>            Reporter: Jose Soltren
>         Attachments: TotalMemoryReportingDesignDoc.pdf
>
>
> Building on some of the core ideas of SPARK-9103, this JIRA proposes tracking total memory used by Spark executors, and a means of broadcasting, aggregating, and reporting memory usage data in the Spark UI.
> Here, "total memory used" refers to memory usage that is visible outside of Spark, to an external observer such as YARN, Mesos, or the operating system. The goal of this enhancement is to give Spark users more information about how Spark clusters are using memory. Total memory will include non-Spark JVM memory and all off-heap memory.
> Please consult the attached design document for further details.
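
To make the quoted notion of "total memory used" concrete: JVM-side accounting via the platform MXBeans, sketched below, only covers heap, non-heap, and NIO buffer pools; native allocations made outside the JVM are invisible to it, which is why an OS-visible measure such as RES is needed. This is a hedged sketch, not code from the attached design doc, and the object name is hypothetical:

{code:scala}
import java.lang.management.{BufferPoolMXBean, ManagementFactory}
import scala.collection.JavaConverters._

object JvmMemorySnapshot {
  // Heap + non-heap (metaspace, code cache, ...) + NIO buffer pools, in bytes.
  // Native allocations outside the JVM are NOT counted here.
  def totalTrackedBytes: Long = {
    val mem = ManagementFactory.getMemoryMXBean
    val heap    = mem.getHeapMemoryUsage.getUsed
    val nonHeap = mem.getNonHeapMemoryUsage.getUsed
    // The "direct" and "mapped" pools account for NIO off-heap buffers.
    val bufferPools = ManagementFactory
      .getPlatformMXBeans(classOf[BufferPoolMXBean]).asScala
      .map(_.getMemoryUsed).sum
    heap + nonHeap + bufferPools
  }
}
{code}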



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org