Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2018/09/25 03:06:00 UTC
[jira] [Resolved] (SPARK-25503) [Spark Job History] Total task message in stage page is ambiguous
[ https://issues.apache.org/jira/browse/SPARK-25503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-25503.
-----------------------------------
Resolution: Fixed
Fix Version/s: 2.4.0
2.3.3
Issue resolved by pull request 22525
[https://github.com/apache/spark/pull/22525]
> [Spark Job History] Total task message in stage page is ambiguous
> -----------------------------------------------------------------
>
> Key: SPARK-25503
> URL: https://issues.apache.org/jira/browse/SPARK-25503
> Project: Spark
> Issue Type: Bug
> Components: Web UI
> Affects Versions: 2.3.1
> Reporter: ABHISHEK KUMAR GUPTA
> Assignee: shahid
> Priority: Major
> Fix For: 2.3.3, 2.4.0
>
>
> *Steps:*
> 1. Spark installed and running properly.
> 2. spark.ui.retainedTasks=100000 (the default value)
> 3. Launch the Spark shell: ./spark-shell --master yarn
> 4. Create a spark-shell application with a single job and 500000 tasks:
> val rdd = sc.parallelize(1 to 500000, 500000)
> rdd.count
> 5. Open the Job History page and go to the spark-shell application created above under Incomplete Applications
> 6. Go to the Jobs page of the application and from there open the Stage page
> 7. Open the Stage Id page for the specific stage of the job created above
> 8. Scroll down and check the task message above the pagination panel
> It displays *Task( 100000, Showing 500000)*
> *Actual Result:*
> It displays Task( 100000, Showing 500000)
> *Expected Result:*
> Since spark.ui.retainedTasks=100000, only 100000 tasks are retained, so only 100000 tasks should be shown.
> The message should therefore be Task( 500000, Showing 100000)
>
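The expected behavior can be sketched as follows. This is a hypothetical helper (taskSummary is not Spark's actual code) illustrating the message format the reporter expects: the first number is the stage's total task count, and the second is capped by the spark.ui.retainedTasks limit.

```scala
// Hypothetical sketch, not Spark's implementation: format the pagination
// summary so that the total comes first and the shown count is capped by
// the retained-task limit (spark.ui.retainedTasks).
def taskSummary(totalTasks: Int, retainedTasks: Int): String = {
  val shown = math.min(totalTasks, retainedTasks)
  s"Task( $totalTasks, Showing $shown)"
}

// With 500000 tasks and a retained limit of 100000, this yields the
// message the reporter expects rather than the swapped one observed.
println(taskSummary(500000, 100000))
```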
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org