Posted to issues@spark.apache.org by "Niranjan Artal (Jira)" <ji...@apache.org> on 2019/12/10 22:11:00 UTC

[jira] [Updated] (SPARK-30209) Display stageId, attemptId, taskId with SQL max metric in UI

     [ https://issues.apache.org/jira/browse/SPARK-30209?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Niranjan Artal updated SPARK-30209:
-----------------------------------
    Description: It would be helpful to display the stageId, stage attemptId, and taskId in the SQL UI alongside each max metric value. These additional details make debugging jobs quicker: for a given operator, the task taking the longest to complete can be identified directly from the Spark UI.  (was: It would be helpful to display the stageId, stage attemptId, and taskId in the SQL UI. These additional details make debugging jobs quicker: for a given operator, the task taking the longest to complete can be identified directly from the Spark UI.)

> Display stageId, attemptId, taskId with SQL max metric in UI
> ------------------------------------------------------------
>
>                 Key: SPARK-30209
>                 URL: https://issues.apache.org/jira/browse/SPARK-30209
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL, Web UI
>    Affects Versions: 3.0.0
>            Reporter: Niranjan Artal
>            Priority: Major
>
> It would be helpful to display the stageId, stage attemptId, and taskId in the SQL UI alongside each max metric value. These additional details make debugging jobs quicker: for a given operator, the task taking the longest to complete can be identified directly from the Spark UI.
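
A minimal sketch of the idea in Scala (not the actual Spark patch; MaxWithTask,
MaxMetricTracker, and the rendered format below are hypothetical names chosen for
illustration): keep the max metric value together with the identity of the task
that produced it, so the UI can render something like "max 42 (stage 3.0: task 17)".

  // Holds the largest metric value seen so far and the task that produced it.
  case class MaxWithTask(value: Long, stageId: Int, stageAttemptId: Int, taskId: Long)

  class MaxMetricTracker {
    private var current: Option[MaxWithTask] = None

    // Record one task's metric value, keeping the largest seen so far.
    // forall is true for None, so the first update always records.
    def update(value: Long, stageId: Int, stageAttemptId: Int, taskId: Long): Unit = {
      if (current.forall(_.value < value)) {
        current = Some(MaxWithTask(value, stageId, stageAttemptId, taskId))
      }
    }

    // Render in the style the issue proposes for the SQL UI.
    def render: String = current match {
      case Some(m) => s"max ${m.value} (stage ${m.stageId}.${m.stageAttemptId}: task ${m.taskId})"
      case None    => "max -"
    }
  }

Example usage:

  val tracker = new MaxMetricTracker
  tracker.update(12, stageId = 3, stageAttemptId = 0, taskId = 15)
  tracker.update(42, stageId = 3, stageAttemptId = 0, taskId = 17)
  println(tracker.render)  // prints: max 42 (stage 3.0: task 17)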



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org