Posted to issues@spark.apache.org by "Ron Kitay (JIRA)" <ji...@apache.org> on 2018/06/30 22:30:00 UTC
[jira] [Commented] (SPARK-17181) [Spark2.0 web ui]The status of the certain jobs is still displayed as running even if all the stages of this job have already finished
[ https://issues.apache.org/jira/browse/SPARK-17181?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16528919#comment-16528919 ]
Ron Kitay commented on SPARK-17181:
-----------------------------------
[~tgraves] - This seems like a very old issue, but it is not minor, as it may cause a severe memory leak in {{JobProgressListener}} - is this still a real issue, or has this code been refactored in Spark 2.2?
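To illustrate why a missed job-end event leaks memory, here is a minimal standalone sketch (not Spark's actual code - the event types and tracker below are hypothetical simplifications of the {{SparkListener}} pattern): a listener prunes its active-jobs map only when the end event arrives, so a dropped end event leaves the entry, and the UI's "running" status, in place forever.

```scala
import scala.collection.mutable

// Hypothetical simplified event types, for illustration only.
case class JobStart(jobId: Int)
case class JobEnd(jobId: Int)

// Sketch of a progress listener: entries are removed ONLY on job end.
class ProgressTracker {
  val activeJobs = mutable.Map.empty[Int, JobStart]

  def onJobStart(e: JobStart): Unit = activeJobs(e.jobId) = e
  def onJobEnd(e: JobEnd): Unit = activeJobs.remove(e.jobId)
}

object Demo extends App {
  val t = new ProgressTracker
  t.onJobStart(JobStart(1000))
  // If the matching JobEnd(1000) is never delivered (the bug above),
  // job 1000 stays in activeJobs - shown as "running" - indefinitely.
  println(t.activeJobs.keySet)
}
```

Under load, every job whose end event is lost accumulates one such entry, which is why the UI symptom and the listener memory growth are the same bug.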
> [Spark2.0 web ui]The status of the certain jobs is still displayed as running even if all the stages of this job have already finished
> ---------------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-17181
> URL: https://issues.apache.org/jira/browse/SPARK-17181
> Project: Spark
> Issue Type: Bug
> Components: Web UI
> Affects Versions: 2.0.0
> Reporter: marymwu
> Priority: Minor
> Attachments: job1000-1.png, job1000-2.png
>
>
> [Spark2.0 web ui]The status of the certain jobs is still displayed as running even if all the stages of this job have already finished
> Note: not sure what kind of jobs will encounter this problem
> The following log shows that job 1000 has already finished, but on the Spark 2.0 web UI the status of job 1000 is still displayed as running; see the attached files.
> 16/08/22 16:01:29 INFO DAGScheduler: dag send msg, result task done, job: 1000
> 16/08/22 16:01:29 INFO DAGScheduler: Job 1000 finished: run at AccessController.java:-2, took 4.664319 s
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org