Posted to issues@spark.apache.org by "Grzegorz Dubicki (JIRA)" <ji...@apache.org> on 2015/01/27 16:51:35 UTC
[jira] [Commented] (SPARK-2008) Enhance spark-ec2 to be able to add and remove slaves to an existing cluster
[ https://issues.apache.org/jira/browse/SPARK-2008?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14293708#comment-14293708 ]
Grzegorz Dubicki commented on SPARK-2008:
-----------------------------------------
Isn't this implemented already? If not, then what is the {{--use-existing-master}} option for?
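For context, the invocation I had in mind is something like the following sketch. The cluster name, key pair name, identity file, and slave count are placeholders, not values from this issue; only {{--use-existing-master}} is the flag under discussion, and whether it actually attaches the new slaves to the running master (rather than requiring the master to be stopped first) is exactly the open question here:

```shell
# Hypothetical sketch: try to add 2 more slaves to a running cluster
# named "my-cluster", reusing its existing master instead of
# provisioning a new one. Key pair and identity file are placeholders.
./spark-ec2 \
  --key-pair=my-keypair \
  --identity-file=my-keypair.pem \
  --slaves=2 \
  --use-existing-master \
  launch my-cluster
```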
> Enhance spark-ec2 to be able to add and remove slaves to an existing cluster
> ----------------------------------------------------------------------------
>
> Key: SPARK-2008
> URL: https://issues.apache.org/jira/browse/SPARK-2008
> Project: Spark
> Issue Type: New Feature
> Components: EC2
> Affects Versions: 1.0.0
> Reporter: Nicholas Chammas
> Priority: Minor
>
> Per [the discussion here|http://apache-spark-user-list.1001560.n3.nabble.com/Having-spark-ec2-join-new-slaves-to-existing-cluster-td3783.html]:
> {quote}
> I would like to be able to use spark-ec2 to launch new slaves and add them to an existing, running cluster. Similarly, I would also like to remove slaves from an existing cluster.
> Use cases include:
> * Oh snap, I sized my cluster incorrectly. Let me add/remove some slaves.
> * During scheduled batch processing, I want to add some new slaves, perhaps on spot instances. When that processing is done, I want to kill them. (Cruel, I know.)
> I gather this is not possible at the moment. spark-ec2 appears to be able to launch new slaves for an existing cluster only if the master is stopped. I also do not see any ability to remove slaves from a cluster.
> {quote}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org