Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/09/23 17:34:20 UTC

[jira] [Commented] (SPARK-17648) TaskSchedulerImpl.resourceOffers should take an IndexedSeq, not a Seq

    [ https://issues.apache.org/jira/browse/SPARK-17648?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15517063#comment-15517063 ] 

Sean Owen commented on SPARK-17648:
-----------------------------------

We recently cleaned up a lot of other instances of this type of thing in https://issues.apache.org/jira/browse/SPARK-17480. A better solution, where possible, is to just iterate over the collection rather than indexing into it by position.
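
A rough sketch of that idea (simplified stand-in types, not the actual scheduler code): iterating directly keeps the loop linear for any {{Seq}}, and {{zipWithIndex}} covers the cases that still need a position.

{code:scala}
// Simplified stand-in for the real WorkerOffer; fields assumed for illustration.
case class WorkerOffer(executorId: String, host: String, cores: Int)

// Positional indexing: offers(i) is O(i) on a linear Seq such as List,
// so the whole loop degrades to O(n^2).
def hostsByIndex(offers: Seq[WorkerOffer]): Seq[String] =
  offers.indices.map(i => offers(i).host)

// Plain iteration: a single O(n) pass for any Seq implementation.
def hostsByIteration(offers: Seq[WorkerOffer]): Seq[String] =
  offers.map(_.host)

// If the position itself is needed, zipWithIndex still walks the Seq only once.
def describeOffers(offers: Seq[WorkerOffer]): Seq[String] =
  offers.zipWithIndex.map { case (offer, i) => s"offer $i on ${offer.host}" }
{code}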

> TaskSchedulerImpl.resourceOffers should take an IndexedSeq, not a Seq
> ---------------------------------------------------------------------
>
>                 Key: SPARK-17648
>                 URL: https://issues.apache.org/jira/browse/SPARK-17648
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler, Spark Core
>    Affects Versions: 2.0.0
>            Reporter: Imran Rashid
>            Assignee: Imran Rashid
>            Priority: Minor
>
> {{TaskSchedulerImpl.resourceOffers}} takes in a {{Seq[WorkerOffer]}}.  However, later on it indexes into this by position.  If you don't pass in an {{IndexedSeq}}, this turns an O(n) operation into an O(n^2) operation.
> In practice, this isn't an issue, since, just by chance, the data structures at the important call sites already happen to be {{IndexedSeq}}s.  But we ought to tighten up the types to make this clearer.  I ran into this while doing some performance tests on the scheduler: performance was terrible when I passed in a {{Seq}}, and even a few hundred offers were scheduled very slowly.
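
For context, a minimal self-contained sketch (not the Spark code itself) of why the {{Seq}} vs {{IndexedSeq}} distinction matters for positional access:

{code:scala}
// apply(i) is O(i) on a linear Seq such as List, but effectively O(1) on an
// IndexedSeq such as Vector, so this loop is O(n^2) vs. O(n) depending on
// which concrete collection the caller passes in.
def sumByIndex(xs: Seq[Int]): Long = {
  var total = 0L
  var i = 0
  while (i < xs.length) {
    total += xs(i)
    i += 1
  }
  total
}

val n = 50000
sumByIndex(Vector.tabulate(n)(identity)) // fast: an IndexedSeq
sumByIndex(List.tabulate(n)(identity))   // slow: quadratic number of list-node hops

// Requiring an IndexedSeq in the signature makes the O(1)-indexing
// assumption explicit instead of relying on what callers happen to pass.
def sumByIndexStrict(xs: IndexedSeq[Int]): Long = sumByIndex(xs)
{code}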



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org