Posted to issues@spark.apache.org by "Kevin Appel (Jira)" <ji...@apache.org> on 2019/09/20 14:03:00 UTC

[jira] [Commented] (SPARK-27169) number of active tasks is negative on executors page

    [ https://issues.apache.org/jira/browse/SPARK-27169?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16934424#comment-16934424 ] 

Kevin Appel commented on SPARK-27169:
-------------------------------------

I have run into a similar issue before on jobs with many thousands of tasks: events are dropped somewhere, and the UI shows gaps or anomalies in the metrics, such as stages that never appear to complete or executors that report negative metrics.

Through trial and error, the following setting now gives reliable metrics:

--conf spark.scheduler.listenerbus.eventqueue.size=200000
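For anyone wondering why enlarging the listener bus queue helps: Spark's listener bus is a bounded in-memory queue, and when listeners fall behind, new events are dropped rather than queued. If a task-start event is recorded but the matching task-end event is dropped (or vice versa), the UI's counters drift, which is how "active tasks" can go negative. The sketch below is an illustrative stand-in (not Spark's actual code) showing how a bounded queue silently loses events under load:

```python
import queue

def post_events(capacity, n_events):
    """Post n_events to a bounded queue; return how many were dropped."""
    bus = queue.Queue(maxsize=capacity)
    dropped = 0
    for i in range(n_events):
        try:
            # Like Spark's listener bus, drop the event when the queue is full
            bus.put_nowait(i)
        except queue.Full:
            dropped += 1
    return dropped

# A small queue drops most events; a larger one (cf. the 200000
# setting above) drops none for the same burst.
print(post_events(capacity=10, n_events=100))
print(post_events(capacity=200, n_events=100))
```

With the larger capacity the burst fits entirely, which is the same effect the spark.scheduler.listenerbus.eventqueue.size increase has on real event bursts.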

> number of active tasks is negative on executors page
> ----------------------------------------------------
>
>                 Key: SPARK-27169
>                 URL: https://issues.apache.org/jira/browse/SPARK-27169
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 2.3.2
>            Reporter: acupple
>            Priority: Minor
>         Attachments: QQ20190315-102215.png, QQ20190315-102235.png, image-2019-03-19-15-17-25-522.png, image-2019-03-19-15-21-03-766.png, job_1924.log, stage_3511.log
>
>
> I use Spark to process data in HDFS and HBase. One thread consumes messages from a queue and submits them to a fixed-size thread pool (16 threads) for Spark processing.
> But after running for some time, there are thousands of active jobs, and the number of active tasks is negative.
> Actually, these jobs are already done when I check the driver logs.
>  
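The submission pattern the reporter describes (one consumer thread feeding a fixed pool of 16 workers, each of which would kick off a Spark action) can be sketched as follows; all names here are illustrative, not taken from the issue:

```python
import queue
import threading
from concurrent.futures import ThreadPoolExecutor

def run_pipeline(messages):
    """One consumer thread drains a queue, handing each message to a
    16-worker pool (a stand-in for submitting a Spark job per message)."""
    inbox = queue.Queue()
    for m in messages:
        inbox.put(m)
    inbox.put(None)  # sentinel so the consumer loop terminates

    results = []
    lock = threading.Lock()

    def process(msg):
        # Placeholder for the per-message Spark processing
        with lock:
            results.append(msg * 2)

    with ThreadPoolExecutor(max_workers=16) as pool:
        while True:
            msg = inbox.get()
            if msg is None:
                break
            pool.submit(process, msg)
        # exiting the with-block waits for all submitted tasks
    return sorted(results)

print(run_pipeline([1, 2, 3]))
```

Under this pattern many concurrent jobs can be in flight at once, which multiplies the event volume hitting the listener bus and makes dropped events (and the resulting stale or negative UI counters) more likely.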



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org