Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2019/10/27 01:02:21 UTC

[jira] [Commented] (SPARK-26760) Incorrect display in Spark UI Executors tab: Active Tasks shows 5 when the number of cores is 4

    [ https://issues.apache.org/jira/browse/SPARK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16960486#comment-16960486 ] 

Sean R. Owen commented on SPARK-26760:
--------------------------------------

Hm, I also wonder: can speculative execution cause this? 
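
One quick way to check that (a sketch, assuming the spark-shell session from the reproduction steps below and otherwise default configs):

    // In the running shell: speculation defaults to false, but a
    // cluster-wide spark-defaults.conf can override it.
    sc.getConf.get("spark.speculation", "false")

    // Re-launch with speculation explicitly disabled (plus the
    // dynamicAllocation flags from the reproduction steps) and repeat
    // the count; if Active Tasks then stays at or below the core
    // count, speculative tasks were being counted.
    // bin/spark-shell --master yarn --conf spark.speculation=false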

> Incorrect display in Spark UI Executors tab: Active Tasks shows 5 when the number of cores is 4
> ------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26760
>                 URL: https://issues.apache.org/jira/browse/SPARK-26760
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.0
>         Environment: Spark 2.4
>            Reporter: ABHISHEK KUMAR GUPTA
>            Priority: Major
>         Attachments: SPARK-26760.png, Screenshot from 2019-02-11 15-09-09.png
>
>
> Steps:
>  # Launch the Spark shell: bin/spark-shell --master yarn --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.initialExecutors=3 --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.executorIdleTimeout=60s --conf spark.dynamicAllocation.maxExecutors=5
>  # Submit a job: sc.parallelize(1 to 10000, 116000).count()
>  # Open the Executors tab of the Spark UI for the RUNNING application
>  # The UI shows the number of cores as 4 while the Active Tasks column shows 5
> Expected:
> The number of active tasks should not exceed the number of cores (see the verification sketch below).
>  
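>
> To double-check the count independently of the UI rendering, a SparkListener can track concurrently running tasks per executor (a minimal sketch for the spark-shell above; the class and variable names are illustrative):
>
>     import java.util.concurrent.atomic.AtomicInteger
>     import scala.collection.concurrent.TrieMap
>     import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd, SparkListenerTaskStart}
>
>     // Counts tasks that have started but not yet finished, per executor,
>     // and records the peak concurrency observed for each one.
>     class ActiveTaskTracker extends SparkListener {
>       val active = TrieMap.empty[String, AtomicInteger]
>       val peak = TrieMap.empty[String, Int]
>       override def onTaskStart(t: SparkListenerTaskStart): Unit = {
>         val exec = t.taskInfo.executorId
>         val n = active.getOrElseUpdate(exec, new AtomicInteger(0)).incrementAndGet()
>         peak.put(exec, math.max(n, peak.getOrElse(exec, 0)))
>       }
>       override def onTaskEnd(t: SparkListenerTaskEnd): Unit =
>         active.getOrElseUpdate(t.taskInfo.executorId, new AtomicInteger(0)).decrementAndGet()
>     }
>
>     val tracker = new ActiveTaskTracker
>     sc.addSparkListener(tracker)
>     sc.parallelize(1 to 10000, 116000).count()
>     // With 4 cores per executor, a peak above 4 means the driver-side
>     // task accounting itself exceeds the core count, not just the UI.
>     println(tracker.peak)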



