Posted to reviews@spark.apache.org by marsishandsome <gi...@git.apache.org> on 2015/02/12 14:23:47 UTC

[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

GitHub user marsishandsome opened a pull request:

    https://github.com/apache/spark/pull/4567

    [SPARK-5771] Number of Cores in Completed Applications of Standalone Master Web Page always be 0 if sc.stop() is called

    In Standalone mode, the number of cores shown for Completed Applications on the Master web page is always zero if sc.stop() is called,
    but it is correct if sc.stop() is not called.
    The likely reason:
    when sc.stop() is called, removeExecutor in ApplicationInfo is invoked for each executor, which reduces the variable coresGranted back to zero. coresGranted is the value displayed in the Cores column on the web page.
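    
    A minimal sketch of the mechanism described above (class and method names mirror ApplicationInfo, but the bodies are simplified for illustration only):
    
        // Simplified sketch, not the actual Spark source.
        class CoreAccountingSketch {
          var coresGranted: Int = 0  // the value shown in the "Cores" column
    
          def addExecutor(cores: Int): Unit = {
            coresGranted += cores
          }
    
          // Called for each executor when sc.stop() tears the application down,
          // so a cleanly stopped application ends with coresGranted == 0.
          def removeExecutor(cores: Int): Unit = {
            coresGranted -= cores
          }
        }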

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/marsishandsome/spark Spark5771

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/4567.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #4567
    
----
commit cfbd97d68e322ce25882a9aa08ae665c6ba24ad0
Author: guliangliang <gu...@qiyi.com>
Date:   2015-02-12T13:21:38Z

    [SPARK-5771] Number of Cores in Completed Applications of Standalone Master Web Page always be 0 if sc.stop() is called

----



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74390919
  
    FWIW I like the current solution in the screenshot.



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by marsishandsome <gi...@git.apache.org>.
Github user marsishandsome commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74191436
  
    @andrewor14 What about showing the number of cores the user requested (total-executor-cores) for now?
    Users or administrators (myself included) may want to see this information.



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74069778
  
    Can one of the admins verify this patch?



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-76313435
  
    Looks like if `defaultCores` is not set, Spark uses `Int.MaxValue` as the requested cores, which is displayed in the UI as:
    
    ![UI](https://dl.dropboxusercontent.com/u/19230832/app_row.PNG)
    
    I think this **Cores Requested** value may be a little confusing; it might be better to display it as *MAX*. What's your opinion, @andrewor14?
    
    Thanks. 



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4567#discussion_r24717155
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/master/ui/MasterPage.scala ---
    @@ -50,12 +50,16 @@ private[spark] class MasterPage(parent: MasterWebUI) extends WebUIPage("") {
         val workers = state.workers.sortBy(_.id)
         val workerTable = UIUtils.listingTable(workerHeaders, workerRow, workers)
     
    -    val appHeaders = Seq("Application ID", "Name", "Cores", "Memory per Node", "Submitted Time",
    -      "User", "State", "Duration")
    +    val activeAppHeaders = Seq("Application ID", "Name", "Cores Using",
    --- End diff --
    
    Nit: Cores Using -> Cores In Use



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/4567



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by squito <gi...@git.apache.org>.
Github user squito commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74094309
  
    @jerryshao I agree with @marsishandsome on this one. The issue is not what you should display for the number of cores when an app stops -- the issue is that it should display something consistent whether or not the app calls `sc.stop`. (Notice that in the attached image, both apps are completed.)
    
    I don't have any strong feelings about whether it should be 0, max, or avg, but I do think it should be consistent whether or not `sc.stop` was called.



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74071365
  
    I also agree with that interpretation, in which case this column is not useful for completed apps and could be removed (or blanked, to retain alignment). If it shows anything, it should be the max or average or something similar.



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74074187
  
    I think it really depends on how you interpret this :).



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4567#discussion_r24963818
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/master/ui/MasterPage.scala ---
    @@ -173,6 +177,30 @@ private[spark] class MasterPage(parent: MasterWebUI) extends WebUIPage("") {
           <td>
             {app.coresGranted}
           </td>
    +      <td>
    +        {app.requestedCores}
    +      </td>
    +      <td sorttable_customkey={app.desc.memoryPerSlave.toString}>
    +        {Utils.megabytesToString(app.desc.memoryPerSlave)}
    +      </td>
    +      <td>{UIUtils.formatDate(app.submitDate)}</td>
    +      <td>{app.desc.user}</td>
    +      <td>{app.state.toString}</td>
    +      <td>{UIUtils.formatDuration(app.duration)}</td>
    +    </tr>
    +  }
    +
    +  private def completeAppRow(app: ApplicationInfo): Seq[Node] = {
    +    <tr>
    --- End diff --
    
    yeah, just have a flag `active: Boolean` or something
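    
    A rough sketch of that flag-based approach (hypothetical helper; columns abbreviated, and it assumes the `scala.xml.Node` import already present in MasterPage.scala):
    
        // Hypothetical sketch: one row builder with an `active` flag instead of
        // separate activeAppRow/completeAppRow methods.
        private def appRow(app: ApplicationInfo, active: Boolean): Seq[Node] = {
          <tr>
            <td>{app.id}</td>
            <td>{app.desc.name}</td>
            {if (active) <td>{app.coresGranted}</td> else Seq.empty[Node]}
            <td>{app.requestedCores}</td>
            <td>{UIUtils.formatDuration(app.duration)}</td>
          </tr>
        }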



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74071187
  
    From my understanding, this is expected behavior: once the app has finished, the number of occupied cores returns to zero. So you want to display the cores used even after the app has finished, is that right?



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4567#discussion_r24580189
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/master/ApplicationInfo.scala ---
    @@ -79,6 +81,7 @@ private[spark] class ApplicationInfo(
         val exec = new ExecutorDesc(newExecutorId(useID), this, worker, cores, desc.memoryPerSlave)
         executors(exec.id) = exec
         coresGranted += cores
    +    coresMax = if(coresGranted > coresMax) coresGranted else coresMax
    --- End diff --
    
    Might be able to use `math.max` here. Also should the column heading say "Cores (max)" in the case of completed apps only? Otherwise LGTM.
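    
    For reference, the `math.max` form of that line would be something like:
    
        coresMax = math.max(coresMax, coresGranted)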



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-85742994
  
    Per discussion on #4841 I'm reverting this.



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by marsishandsome <gi...@git.apache.org>.
Github user marsishandsome commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74074999
  
    ![default](https://cloud.githubusercontent.com/assets/5574887/6168735/c8600e78-b302-11e4-8c3b-1e04854d0735.PNG)
    
    I'm really confused by the core numbers shown above.




[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-75939381
  
    Looks like everything has been addressed and also LGTM. I'll wait a beat for any more feedback but think this can be merged. (@andrewor14 I'll link your suggestion on SPARK-5076 to see what Josh thinks.)



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by marsishandsome <gi...@git.apache.org>.
Github user marsishandsome commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74072604
  
    The problem here is that the Cores value displayed is confusing, because it depends on whether sc.stop() is called or not.
    
    So I chose to display the maximum number of cores instead.



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-75886922
  
      [Test build #27926 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/27926/consoleFull) for   PR 4567 at commit [`694796e`](https://github.com/apache/spark/commit/694796ed65b7b0bd533c147d5a3561b6e575c9f3).
     * This patch merges cleanly.



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by marsishandsome <gi...@git.apache.org>.
Github user marsishandsome commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-76342480
  
    Thanks @jerryshao 



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-75886480
  
    retest this please



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74198708
  
    Thanks @squito, it seems this really is an issue. The problem is that the number of cores can vary as executors fail or are added (dynamic allocation), so `coresMax` might not be a good choice. What about showing the initially requested cores (`myMaxCores`) and renaming the header? Just my thoughts :).



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by marsishandsome <gi...@git.apache.org>.
Github user marsishandsome commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74220077
  
    ![default](https://cloud.githubusercontent.com/assets/5574887/6184443/a59134c2-b39c-11e4-8206-f546276f80c7.PNG)
    
    For Running Applications, Cores Using and Cores Requested will be shown.
    For Completed Applications, only Cores Requested will be shown.
    
    What do you think about it?




[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-75894700
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/27926/
    Test PASSed.



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4567#discussion_r24963779
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/master/ui/MasterPage.scala ---
    @@ -50,12 +50,16 @@ private[spark] class MasterPage(parent: MasterWebUI) extends WebUIPage("") {
         val workers = state.workers.sortBy(_.id)
         val workerTable = UIUtils.listingTable(workerHeaders, workerRow, workers)
     
    -    val appHeaders = Seq("Application ID", "Name", "Cores", "Memory per Node", "Submitted Time",
    -      "User", "State", "Duration")
    +    val activeAppHeaders = Seq("Application ID", "Name", "Cores Using",
    --- End diff --
    
    more nit: `Cores in Use`? (lower case i)



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-75894695
  
      [Test build #27926 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/27926/consoleFull) for   PR 4567 at commit [`694796e`](https://github.com/apache/spark/commit/694796ed65b7b0bd533c147d5a3561b6e575c9f3).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74141527
  
    Looks good for now. However, if we extend dynamic allocation to standalone mode in the future, executors will be coming and going all the time, in which case `coresMax` could grow arbitrarily large and make less sense. @marsishandsome, also, what happens if executors fail multiple times (but not enough to fail the application)? Do we count all those cores too?



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-76314819
  
    @jerryshao Darn, yeah that needs to be fixed. `*` or `ALL` seem good to me; it can be displayed when `requestedCores == Int.MaxValue`. @marsishandsome what do you think about making a small follow-on PR for that fix?
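    
    A possible shape for that follow-on fix (illustrative sketch only, not the actual patch):
    
        // Show "*" when no explicit core cap was requested, i.e. when
        // requestedCores fell back to Int.MaxValue because defaultCores was unset.
        val requestedCoresLabel =
          if (app.requestedCores == Int.MaxValue) "*" else app.requestedCores.toString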



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-76315541
  
    I think `*` is better, let me do a quick fix based on this JIRA.



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4567#discussion_r24717170
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/master/ui/MasterPage.scala ---
    @@ -173,6 +177,30 @@ private[spark] class MasterPage(parent: MasterWebUI) extends WebUIPage("") {
           <td>
             {app.coresGranted}
           </td>
    +      <td>
    +        {app.requestedCores}
    +      </td>
    +      <td sorttable_customkey={app.desc.memoryPerSlave.toString}>
    +        {Utils.megabytesToString(app.desc.memoryPerSlave)}
    +      </td>
    +      <td>{UIUtils.formatDate(app.submitDate)}</td>
    +      <td>{app.desc.user}</td>
    +      <td>{app.state.toString}</td>
    +      <td>{UIUtils.formatDuration(app.duration)}</td>
    +    </tr>
    +  }
    +
    +  private def completeAppRow(app: ApplicationInfo): Seq[Node] = {
    +    <tr>
    --- End diff --
    
    Is there a way to refactor this to not duplicate most of the row markup? like save off two other `Seq[Node]` of the common prefix and suffix? 
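    
    One way the shared prefix/suffix idea could look (hypothetical sketch, columns abbreviated):
    
        // Hypothetical sketch: keep the shared cells in helpers and splice in the
        // per-table extras so the two row builders don't repeat the markup.
        private def commonPrefix(app: ApplicationInfo): Seq[Node] =
          <td>{app.id}</td> ++ <td>{app.desc.name}</td>
    
        private def commonSuffix(app: ApplicationInfo): Seq[Node] =
          <td>{UIUtils.formatDate(app.submitDate)}</td> ++ <td>{app.desc.user}</td>
    
        private def activeAppRow(app: ApplicationInfo): Seq[Node] =
          <tr>{commonPrefix(app)}<td>{app.coresGranted}</td>{commonSuffix(app)}</tr>
    
        private def completeAppRow(app: ApplicationInfo): Seq[Node] =
          <tr>{commonPrefix(app)}<td>{app.requestedCores}</td>{commonSuffix(app)}</tr>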



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-76315240
  
    I think `--` might be more appropriate, since the user didn't specify the number of cores to request. `*` is also fine since there's `local[*]` with the same semantics (i.e. this app will grab all the cores available to it).



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74755232
  
    See https://issues.apache.org/jira/browse/SPARK-5076 too; it suggests removing not only the current cores column for completed apps but also the Memory per Node column. WDYT?



[GitHub] spark pull request: [SPARK-5771] Number of Cores in Completed Appl...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/4567#issuecomment-74993560
  
    Hey @marsishandsome the new solution looks good. Once you address @srowen's in-line comments I will merge this. As for SPARK-5076, I think having information about the cores and executor memory actually helps the user identify his/her application. I am inclined to close that one as a won't fix and favor this PR.

