Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/05/21 04:04:00 UTC

[jira] [Assigned] (SPARK-7779) Dynamic allocation: confusing message when canceling requests

     [ https://issues.apache.org/jira/browse/SPARK-7779?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-7779:
-----------------------------------

    Assignee: Andrew Or  (was: Apache Spark)

> Dynamic allocation: confusing message when canceling requests
> -------------------------------------------------------------
>
>                 Key: SPARK-7779
>                 URL: https://issues.apache.org/jira/browse/SPARK-7779
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: Andrew Or
>            Assignee: Andrew Or
>
> Right now a long-running job with dynamic allocation outputs the following, which is somewhat confusing to the user (why is my new desired total dropping?):
> {code}
> INFO ExecutorAllocationManager: Requesting 0 new executor(s) because tasks are backlogged (new desired total will be 46)
> INFO ExecutorAllocationManager: Requesting 0 new executor(s) because tasks are backlogged (new desired total will be 44)
> INFO ExecutorAllocationManager: Requesting 0 new executor(s) because tasks are backlogged (new desired total will be 42)
> INFO ExecutorAllocationManager: Requesting 0 new executor(s) because tasks are backlogged (new desired total will be 40)
> INFO ExecutorAllocationManager: Requesting 0 new executor(s) because tasks are backlogged (new desired total will be 38)
> INFO ExecutorAllocationManager: Requesting 0 new executor(s) because tasks are backlogged (new desired total will be 36)
> ...
> {code}
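The log lines above can be reproduced with a minimal sketch. This is a hypothetical simplification, not Spark's actual `ExecutorAllocationManager` code: it assumes the "Requesting N new executor(s)" message is printed on every sync with a delta clamped to zero, while the desired total keeps shrinking as earlier pending requests are canceled.

```python
def allocation_log_lines(desired_totals, current_target):
    """Produce the confusing log lines for a sequence of shrinking targets.

    Hypothetical model: the message is emitted on every update, with the
    number of *new* executors clamped at 0 when the target is dropping.
    """
    lines = []
    for new_total in desired_totals:
        # Target is below what we already asked for, so no new executors
        # are requested -- yet the message still says "Requesting 0 ...".
        delta = max(0, new_total - current_target)
        lines.append(
            f"INFO ExecutorAllocationManager: Requesting {delta} new executor(s) "
            f"because tasks are backlogged (new desired total will be {new_total})"
        )
        current_target = new_total
    return lines

for line in allocation_log_lines([46, 44, 42, 40], 48):
    print(line)
```

Because the delta is clamped, every line reads "Requesting 0 new executor(s)" while the parenthesized total steps down (46, 44, 42, ...), matching the output quoted above; a clearer message would distinguish canceling pending requests from requesting new executors.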



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org