Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/01/18 11:19:39 UTC

[jira] [Resolved] (SPARK-2962) Suboptimal scheduling in spark

     [ https://issues.apache.org/jira/browse/SPARK-2962?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-2962.
------------------------------
    Resolution: Not A Problem

Given the final comment, it sounds like this was in fact addressed.

> Suboptimal scheduling in spark
> ------------------------------
>
>                 Key: SPARK-2962
>                 URL: https://issues.apache.org/jira/browse/SPARK-2962
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.1.0
>         Environment: All
>            Reporter: Mridul Muralidharan
>
> In findTask, irrespective of the 'locality' level requested, tasks in pendingTasksWithNoPrefs are always scheduled as PROCESS_LOCAL.
> pendingTasksWithNoPrefs contains tasks whose preferred locations currently have no alive executors, but whose locations could come up 'later': this is particularly relevant when a Spark application is just coming up and containers are still being added.
> This causes a large number of non-node-local tasks to be scheduled, incurring significant network transfer in the cluster when running with non-trivial datasets.
> The comment in the method code, "// Look for no-pref tasks after rack-local tasks since they can run anywhere.", is misleading: locality levels are tried from PROCESS_LOCAL down to ANY, so no-pref tasks get scheduled well before rack-local ones (see the sketch after the quoted description).
> Also note that currentLocalityIndex is reset to the taskLocality returned by this method, so returning PROCESS_LOCAL as the level will trigger the locality wait times again. (This was relevant before a recent change to the scheduler, and might be again depending on the resolution of this issue.)
> Found while writing a test for SPARK-2931.
>  
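For illustration, below is a minimal, hypothetical Scala sketch of the ordering the report describes. It is not Spark's actual TaskSetManager code: the Task case class, the locality objects, and this findTask are simplified stand-ins, assumed only for this example. The point it shows is that levels are tried from most local to least local, so a task with no alive preferred locations is handed out at the PROCESS_LOCAL step regardless of the level the caller asked for.

// Minimal, hypothetical sketch (NOT Spark's actual TaskSetManager code) of the
// ordering described in the report: locality levels are tried from most local
// to least local, so tasks with no currently-alive preferred locations are
// offered at the PROCESS_LOCAL level, ahead of node- and rack-local matches.
object LocalityOrderSketch {

  // Simplified locality levels, most local first.
  sealed trait Locality
  case object ProcessLocal extends Locality
  case object NodeLocal extends Locality
  case object RackLocal extends Locality
  case object AnyLevel extends Locality

  private val levels: Seq[Locality] =
    Seq(ProcessLocal, NodeLocal, RackLocal, AnyLevel)

  // Hypothetical task: an id plus preferred hosts (empty = "no prefs").
  final case class Task(id: Int, preferredHosts: Set[String])

  // Try each level up to and including maxAllowed. No-pref tasks are matched
  // at the first (PROCESS_LOCAL) step, so they come back as PROCESS_LOCAL no
  // matter what level the caller asked for; if the caller then resets its
  // locality index from the returned level, the locality wait starts over.
  def findTask(pending: Seq[Task], host: String,
               maxAllowed: Locality): Option[(Task, Locality)] = {
    val allowed = levels.takeWhile(_ != maxAllowed) :+ maxAllowed
    allowed.foldLeft(Option.empty[(Task, Locality)]) { (found, level) =>
      found.orElse(level match {
        case ProcessLocal =>
          // No-pref tasks are picked up here, before any node/rack match.
          pending.find(_.preferredHosts.isEmpty).map(t => (t, ProcessLocal))
        case NodeLocal =>
          pending.find(_.preferredHosts.contains(host)).map(t => (t, NodeLocal))
        case _ =>
          // RackLocal / AnyLevel collapsed for brevity: take whatever is left.
          pending.headOption.map(t => (t, level))
      })
    }
  }

  def main(args: Array[String]): Unit = {
    val pending = Seq(
      Task(1, preferredHosts = Set("host-b")), // prefers a host we are not on
      Task(2, preferredHosts = Set.empty)      // "no prefs": locations not alive yet
    )
    // Even when only NODE_LOCAL scheduling is requested, the no-pref task wins
    // and is reported as PROCESS_LOCAL.
    println(findTask(pending, host = "host-a", maxAllowed = NodeLocal))
    // prints: Some((Task(2,Set()),ProcessLocal))
  }
}

In this simplified model, the no-pref task is matched at the PROCESS_LOCAL step even though the caller only allowed NODE_LOCAL, which mirrors both complaints in the report: no-pref tasks are placed without regard to where their data may later live, and the returned PROCESS_LOCAL level can restart the locality wait.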



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org