Posted to issues@spark.apache.org by "Kazuaki Ishizaki (JIRA)" <ji...@apache.org> on 2016/04/17 20:36:25 UTC

[jira] [Commented] (SPARK-13904) Add support for pluggable cluster manager

    [ https://issues.apache.org/jira/browse/SPARK-13904?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15244805#comment-15244805 ] 

Kazuaki Ishizaki commented on SPARK-13904:
------------------------------------------

Merging this PR may have started causing test failures. Would it be possible to look at these links?
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.6/627/
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.6/626/

cf. [SPARK-14690]

> Add support for pluggable cluster manager
> -----------------------------------------
>
>                 Key: SPARK-13904
>                 URL: https://issues.apache.org/jira/browse/SPARK-13904
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler
>            Reporter: Hemant Bhanawat
>
> Currently Spark supports only a few cluster managers, namely YARN, Mesos, and Standalone. But as Spark is being used in newer and more varied use cases, there is a need to allow other cluster managers to manage Spark components. One such use case is embedding Spark components like the executor and the driver inside another process, which may be a datastore; this allows colocation of data and processing. Another requirement that stems from such a use case is that the executors/driver should not take the parent process down when they go down, and that the components can be relaunched inside the same process.
> So this JIRA requests two capabilities:
> 1. Support for external cluster managers.
> 2. Allow a cluster manager to clean up its tasks without taking the parent process down (a rough interface sketch follows below).
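> For illustration, here is a minimal sketch of what such a pluggable interface could look like; the trait name, method signatures, and discovery mechanism below are assumptions for discussion, not a final API:
> {code}
> import org.apache.spark.SparkContext
> import org.apache.spark.scheduler.{SchedulerBackend, TaskScheduler}
>
> // Hypothetical SPI that an external cluster manager would implement.
> trait ExternalClusterManager {
>
>   // Whether this manager can handle the given master URL,
>   // e.g. "mydatastore://host:port".
>   def canCreate(masterURL: String): Boolean
>
>   // Build the task scheduler to use for this master URL.
>   def createTaskScheduler(sc: SparkContext, masterURL: String): TaskScheduler
>
>   // Build the scheduler backend that talks to the external manager.
>   def createSchedulerBackend(
>       sc: SparkContext,
>       masterURL: String,
>       scheduler: TaskScheduler): SchedulerBackend
>
>   // Wire the scheduler and backend together once both are created.
>   def initialize(scheduler: TaskScheduler, backend: SchedulerBackend): Unit
> }
> {code}
> An implementation could then be discovered at runtime, e.g. via java.util.ServiceLoader, so that an unrecognized master URL resolves to the external manager instead of failing; task cleanup would stay inside the implementation, leaving the embedding process alive.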


