Posted to issues@spark.apache.org by "Fei Shao (JIRA)" <ji...@apache.org> on 2017/06/09 03:34:18 UTC

[jira] [Commented] (SPARK-20589) Allow limiting task concurrency per stage

    [ https://issues.apache.org/jira/browse/SPARK-20589?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16043877#comment-16043877 ] 

Fei Shao commented on SPARK-20589:
----------------------------------

Tasks are assigned to executors, not to stages directly. If we set the number of executors to 5 but cap the number of simultaneous tasks for a stage at 2, most of those executors sit idle, which looks like a contradiction between the two settings.

So can we change the requirement to "allow limiting task concurrency per executor" please?

> Allow limiting task concurrency per stage
> -----------------------------------------
>
>                 Key: SPARK-20589
>                 URL: https://issues.apache.org/jira/browse/SPARK-20589
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler
>    Affects Versions: 2.1.0
>            Reporter: Thomas Graves
>
> It would be nice to have the ability to limit the number of concurrent tasks per stage. This is useful when your Spark job might be accessing another service and you don't want to DoS that service. For instance, Spark writing to HBase or Spark doing HTTP puts on a service. Many times you want to do this without limiting the number of partitions.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org