Posted to issues@spark.apache.org by "shahid (JIRA)" <ji...@apache.org> on 2019/01/29 06:38:00 UTC
[jira] [Comment Edited] (SPARK-26760) [Spark Incorrect display in YARN UI Executor Tab when number of cores is 4 and Active Task display as 5 in Executor Tab of YARN UI]
[ https://issues.apache.org/jira/browse/SPARK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16754650#comment-16754650 ]
shahid edited comment on SPARK-26760 at 1/29/19 6:37 AM:
---------------------------------------------------------
[~abhishek.akg] I would like to work on it.
was (Author: shahid):
[~abhishek.akg] I would like to work on it, if no one is working.
> [Spark Incorrect display in YARN UI Executor Tab when number of cores is 4 and Active Task display as 5 in Executor Tab of YARN UI]
> -----------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-26760
> URL: https://issues.apache.org/jira/browse/SPARK-26760
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.4.0
> Environment: Spark 2.4
> Reporter: ABHISHEK KUMAR GUPTA
> Priority: Major
> Attachments: SPARK-26760.png
>
>
> Steps:
> # Launch Spark Shell
> # bin/spark-shell --master yarn --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.initialExecutors=3 --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.executorIdleTimeout=60s --conf spark.dynamicAllocation.maxExecutors=5
> # Submit a job: sc.parallelize(1 to 10000,116000).count()
> # Check the YARN UI Executor Tab for the RUNNING application
> # UI display as Number of cores 4 and Active Tasks column shows as 5
> Expected:
> The Number of Active Tasks should be the same as the Number of Cores.
>
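The expectation in the report follows from how Spark sizes an executor's scheduler slots: the maximum number of concurrently running tasks per executor is the executor's cores divided by `spark.task.cpus` (default 1), so with 4 cores the Executors tab should never show more than 4 active tasks. A minimal sketch of that invariant (the helper name `max_active_tasks` is illustrative, not a Spark API):

```python
def max_active_tasks(executor_cores: int, task_cpus: int = 1) -> int:
    """Scheduler slots per executor: floor(cores / spark.task.cpus).

    With the default task_cpus=1, the cap equals the core count, which
    is why showing 5 active tasks on a 4-core executor looks wrong.
    """
    return executor_cores // task_cpus

# The scenario from the report: 4 cores, default 1 CPU per task.
print(max_active_tasks(4))  # 4, not the 5 shown in the UI
```

If `spark.task.cpus` were raised to 2, the cap would drop to 2 on the same executor; either way the UI value in the screenshot exceeds the possible slot count, which points at a display/accounting issue rather than real over-scheduling.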
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org