Posted to issues@spark.apache.org by "Noah Young (JIRA)" <ji...@apache.org> on 2014/12/27 20:35:13 UTC

[jira] [Created] (SPARK-4977) spark-ec2 start resets all the spark/conf configurations

Noah Young created SPARK-4977:
---------------------------------

             Summary: spark-ec2 start resets all the spark/conf configurations
                 Key: SPARK-4977
                 URL: https://issues.apache.org/jira/browse/SPARK-4977
             Project: Spark
          Issue Type: Bug
          Components: EC2
    Affects Versions: 1.2.0
            Reporter: Noah Young
            Priority: Minor


Running `spark-ec2 start` to restart an already-launched cluster causes the cluster setup scripts to be rerun, which overwrites any existing Spark configuration files on the remote machines. The expected behavior is that all the modules (Tachyon, Hadoop, Spark itself) are restarted, and perhaps the master's configuration is copy-dir'd out to the slaves, but anything in spark/conf should (at least optionally) be left alone.

As far as I know, one must currently create and execute their own init script to re-apply all Spark configuration settings after restarting a cluster; a rough sketch of such a script follows.
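
For reference, a stopgap along the following lines should work (a rough sketch only: the key pair, identity file, cluster name, local backup directory, and the /root/spark layout and copy-dir helper on the spark-ec2 AMI are assumptions on my part, not a documented interface):

import os
import subprocess

# Stopgap sketch, not part of spark-ec2. All names and paths below are
# assumptions: adjust the key pair, identity file, cluster name, and the
# location of a locally kept copy of spark/conf.
KEY_PAIR = "my-key"                                 # assumed EC2 key pair name
KEY_FILE = os.path.expanduser("~/.ssh/my-key.pem")  # assumed identity file
CLUSTER = "my-cluster"                              # assumed cluster name
LOCAL_CONF = "./spark-conf-backup/"                 # local copy of spark/conf to restore
REMOTE_CONF = "/root/spark/conf/"                   # default location on the spark-ec2 AMI

def run(cmd):
    """Echo and run a command, failing loudly on a non-zero exit code."""
    print("+ " + " ".join(cmd))
    subprocess.check_call(cmd)

def master_host():
    """Ask spark-ec2 for the master hostname (assumed to be the last line
    of `get-master` output)."""
    out = subprocess.check_output(
        ["./spark-ec2", "-k", KEY_PAIR, "-i", KEY_FILE, "get-master", CLUSTER])
    return out.decode("utf-8").strip().splitlines()[-1]

# 1. Restart the cluster; as described above, this clobbers spark/conf.
run(["./spark-ec2", "-k", KEY_PAIR, "-i", KEY_FILE, "start", CLUSTER])

# 2. Copy the saved configuration back onto the master (rsync assumed available).
master = master_host()
run(["rsync", "-av", "-e", "ssh -i " + KEY_FILE, LOCAL_CONF,
     "root@%s:%s" % (master, REMOTE_CONF)])

# 3. Fan the directory out to the slaves and restart Spark so the settings
#    take effect (copy-dir and the sbin scripts ship with the spark-ec2 AMI).
run(["ssh", "-i", KEY_FILE, "root@" + master,
     "/root/spark-ec2/copy-dir %s && /root/spark/sbin/stop-all.sh"
     " && /root/spark/sbin/start-all.sh" % REMOTE_CONF])

Ideally spark-ec2 itself would either skip the setup step on `start`, or take a flag to preserve the existing spark/conf.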



