Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/01/28 14:43:39 UTC

[jira] [Resolved] (SPARK-4977) spark-ec2 start resets all the spark/conf configurations

     [ https://issues.apache.org/jira/browse/SPARK-4977?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-4977.
------------------------------
    Resolution: Won't Fix

> spark-ec2 start resets all the spark/conf configurations
> --------------------------------------------------------
>
>                 Key: SPARK-4977
>                 URL: https://issues.apache.org/jira/browse/SPARK-4977
>             Project: Spark
>          Issue Type: Bug
>          Components: EC2
>    Affects Versions: 1.2.0
>            Reporter: Noah Young
>            Priority: Minor
>
> Running `spark-ec2 start` to restart an already-launched cluster causes the cluster setup scripts to be run again, which resets any existing Spark configuration files on the remote machines. The expected behavior is that all the modules (tachyon, hadoop, spark itself) should be restarted, and perhaps the master's configuration pushed out with copy-dir, but anything in spark/conf should (at least optionally) be left alone.
> As far as I know, the only workaround is to write and run your own init script that re-applies the desired Spark configuration after restarting a cluster.
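
For illustration, below is a minimal sketch of the kind of workaround script the description mentions: back up spark/conf before running `spark-ec2 start`, then restore it and push it to the slaves afterwards. The cluster name, key pair, key file, master hostname, and the paths /root/spark/conf and /root/spark-ec2/copy-dir are assumptions based on the default spark-ec2 AMI layout and may need adjusting for a given deployment.

    #!/usr/bin/env python
    # Hedged sketch: preserve spark/conf across a `spark-ec2 start` restart.
    # All names below are placeholders/assumptions, not values from the issue.
    import subprocess

    CLUSTER = "my-cluster"                                   # assumed cluster name used at launch
    KEYPAIR = "my-keypair"                                   # assumed EC2 key pair name
    KEYFILE = "/path/to/key.pem"                             # assumed identity file for SSH
    MASTER = "ec2-xx-xx-xx-xx.compute-1.amazonaws.com"       # assumed master public DNS

    SSH = ["ssh", "-i", KEYFILE, "root@" + MASTER]
    SCP = ["scp", "-r", "-i", KEYFILE]

    def run(cmd):
        print(" ".join(cmd))
        subprocess.check_call(cmd)

    # 1. Save the current configuration locally before restarting.
    run(SCP + ["root@%s:/root/spark/conf" % MASTER, "conf-backup"])

    # 2. Restart the cluster; this re-runs the setup scripts and resets spark/conf.
    #    Adjust options (-k, -i, --region, ...) to match how the cluster was launched.
    run(["./spark-ec2", "-k", KEYPAIR, "-i", KEYFILE, "start", CLUSTER])

    # 3. Restore the saved configuration on the master and copy it out to the slaves.
    run(SCP + ["conf-backup", "root@%s:/root/conf-backup" % MASTER])
    run(SSH + ["cp /root/conf-backup/* /root/spark/conf/ && /root/spark-ec2/copy-dir /root/spark/conf"])

Backing up to the local machine rather than on the master is deliberate in this sketch, since the restart may rewrite files anywhere under /root/spark on the remote instances.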



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org