Posted to issues@spark.apache.org by "Christian Kadner (JIRA)" <ji...@apache.org> on 2015/10/28 02:48:27 UTC

[jira] [Commented] (SPARK-4836) Web UI should display separate information for all stage attempts

    [ https://issues.apache.org/jira/browse/SPARK-4836?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14977551#comment-14977551 ] 

Christian Kadner commented on SPARK-4836:
-----------------------------------------

Hi [~joshrosen], is this still a problem? And if so, do you have a somewhat _"reliable"_ repro scenario or a nifty way to fake this: {quote}"...(job) lost some partitions of that stage and had to run a new stage attempt to recompute one or two tasks from that stage..."{quote}

> Web UI should display separate information for all stage attempts
> -----------------------------------------------------------------
>
>                 Key: SPARK-4836
>                 URL: https://issues.apache.org/jira/browse/SPARK-4836
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 1.1.1, 1.2.0
>            Reporter: Josh Rosen
>
> I've run into some cases where the web UI job page will say that a job took 12 minutes but the sum of that job's stage times is something like 10 seconds.  In this case, it turns out that my job ran a stage to completion (which took, say, 5 minutes) then lost some partitions of that stage and had to run a new stage attempt to recompute one or two tasks from that stage.  As a result, the latest attempt for that stage reports only one or two tasks.  In the web UI, it seems that we only show the latest stage attempt, not all attempts, which can lead to confusing / misleading displays for jobs with failed / partially-recomputed stages.
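One possible way to provoke the scenario described above (a sketch only, not a verified repro; the partition counts and the 30-second delay are illustrative assumptions): run a two-stage shuffle job on a cluster with at least two executors, slow down the reduce side, and manually kill one executor process while that stage is running so its shuffle output is lost. The scheduler should then hit a fetch failure and resubmit the earlier stage as a new attempt that recomputes only the missing map outputs.

{code:scala}
// Sketch for spark-shell on a cluster with >= 2 executors (illustrative values).
// Stage 1: a wide shuffle whose map outputs live on the executors.
val mapped = sc.parallelize(1 to 1000000, 100).map(i => (i % 1000, i))

// Stage 2: slow the reduce side down so there is a window to kill one
// executor JVM by hand (e.g. kill -9 on the worker node) while it runs.
// Losing that executor's shuffle files should cause a fetch failure and a
// partial re-run of the earlier stage as a new stage attempt.
val counts = mapped.reduceByKey(_ + _).mapPartitions { iter =>
  Thread.sleep(30000)   // ~30s window to kill an executor manually
  iter
}

counts.count()
// Afterwards, check the stage page in the web UI: the retried stage attempt
// may report only the one or two tasks that were actually re-run.
{code}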



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org