Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/12/14 12:10:46 UTC

[jira] [Assigned] (SPARK-11882) Allow for running Spark applications against a custom coarse grained scheduler

     [ https://issues.apache.org/jira/browse/SPARK-11882?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-11882:
------------------------------------

    Assignee: Apache Spark

> Allow for running Spark applications against a custom coarse grained scheduler
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-11882
>                 URL: https://issues.apache.org/jira/browse/SPARK-11882
>             Project: Spark
>          Issue Type: Wish
>          Components: Spark Core, Spark Submit
>            Reporter: Jacek Lewandowski
>            Assignee: Apache Spark
>            Priority: Minor
>
> SparkContext decides which scheduler to use based on the master URI. How about allowing applications to run against a custom scheduler? Such a custom scheduler would just extend {{CoarseGrainedSchedulerBackend}}. 
> The custom scheduler would be created by a user-provided factory. Factories would be registered in the configuration as {{spark.scheduler.factory.<name>=<factory-class>}}, where {{name}} is the scheduler name. Once {{SparkContext}} determines that the master address does not refer to standalone, YARN, Mesos, local, or any other predefined scheduler, it would extract the scheme from the master URI and look up the scheduler factory whose name equals that scheme (a rough sketch follows the example below). 
> For example:
> {{spark.scheduler.factory.custom=org.a.b.c.CustomSchedulerFactory}}
> and the master address would then be {{custom://192.168.1.1}}
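> As a rough sketch of how this extension point could look ({{SchedulerFactory}}, its method signature, and {{CustomSchedulerFactory}} are hypothetical names for illustration, not an existing Spark API; note also that {{CoarseGrainedSchedulerBackend}} and {{RpcEnv}} are currently internal to Spark, so the proposal implies exposing them to user code):
> {code:scala}
> import org.apache.spark.rpc.RpcEnv
> import org.apache.spark.scheduler.TaskSchedulerImpl
> import org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend
> 
> // Hypothetical extension point: an implementation builds the scheduler
> // backend for a master URI whose scheme matched the factory's registered name.
> trait SchedulerFactory {
>   def createSchedulerBackend(
>       scheduler: TaskSchedulerImpl,
>       rpcEnv: RpcEnv,
>       masterUri: String): CoarseGrainedSchedulerBackend
> }
> 
> // Example user-provided factory, registered in the configuration as
> // spark.scheduler.factory.custom=org.a.b.c.CustomSchedulerFactory
> class CustomSchedulerFactory extends SchedulerFactory {
>   override def createSchedulerBackend(
>       scheduler: TaskSchedulerImpl,
>       rpcEnv: RpcEnv,
>       masterUri: String): CoarseGrainedSchedulerBackend =
>     new CoarseGrainedSchedulerBackend(scheduler, rpcEnv) {
>       // A real backend would contact the external cluster manager at
>       // masterUri (e.g. custom://192.168.1.1) and request executors here.
>     }
> }
> 
> // SparkContext.createTaskScheduler could then resolve the factory roughly as:
> //   val scheme = new java.net.URI(master).getScheme                // "custom"
> //   val factory = Class.forName(conf.get(s"spark.scheduler.factory.$scheme"))
> //     .newInstance().asInstanceOf[SchedulerFactory]
> //   val backend = factory.createSchedulerBackend(scheduler, env.rpcEnv, master)
> {code}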



