Posted to issues@spark.apache.org by "handong (Jira)" <ji...@apache.org> on 2023/02/02 13:51:00 UTC

[jira] [Updated] (SPARK-42293) why executor memory used is shown greater than total available memory on spark ui

     [ https://issues.apache.org/jira/browse/SPARK-42293?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

handong updated SPARK-42293:
----------------------------
    Description: 
*I have a Spark Streaming job that has been running for around the last 3 weeks. When I open the Executors tab in the Spark web UI, it shows:*
 # {{memory used - 36.1GB}}
 # {{total available memory for storage - 3.2GB}}

*Please refer to the screenshot of the Spark UI below:*

!https://i.stack.imgur.com/nmk39.jpg!

  was:
I have a Spark Streaming job that has been running for around the last 3 weeks. When I open the Executors tab in the Spark web UI, it shows:
 # {{memory used - 36.1GB}}
 # {{total available memory for storage - 3.2GB}}
Please refer to the screenshot of the Spark UI below:

!https://i.stack.imgur.com/nmk39.jpg!


> why executor memory used is shown greater than total available memory on spark ui
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-42293
>                 URL: https://issues.apache.org/jira/browse/SPARK-42293
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.5
>            Reporter: handong
>            Priority: Major
>
> *I have a Spark Streaming job that has been running for around the last 3 weeks. When I open the Executors tab in the Spark web UI, it shows:*
>  # {{memory used - 36.1GB}}
>  # {{total available memory for storage - 3.2GB}}
> *Please refer to the screenshot of the Spark UI below:*
> !https://i.stack.imgur.com/nmk39.jpg!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org