Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2014/12/13 23:20:13 UTC

[jira] [Resolved] (SPARK-1574) ec2/spark_ec2.py should provide option to control number of attempts for ssh operations

     [ https://issues.apache.org/jira/browse/SPARK-1574?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen resolved SPARK-1574.
-------------------------------
    Resolution: Fixed
      Assignee: Nicholas Chammas

I think that SPARK-3398 should have fixed this in Spark 1.2.0, so I'm going to mark this issue as 'Fixed'.

> ec2/spark_ec2.py should provide option to control number of attempts for ssh operations
> ---------------------------------------------------------------------------------------
>
>                 Key: SPARK-1574
>                 URL: https://issues.apache.org/jira/browse/SPARK-1574
>             Project: Spark
>          Issue Type: Improvement
>          Components: EC2
>    Affects Versions: 0.9.0
>            Reporter: Art Peel
>            Assignee: Nicholas Chammas
>            Priority: Minor
>
> EC2 instances are sometimes slow to start up. When this happens, generating the cluster ssh key or sending the generated cluster key to the slaves can fail due to an ssh timeout.
> The script currently hard-codes the number of tries for ssh operations as 2.
> For more flexibility, it should be possible to specify the number of tries with a command-line option, --num-ssh-tries, that defaults to 2 to keep the current behavior if not provided.
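
For context, a minimal sketch of how such an option might be wired up. The option name --num-ssh-tries comes from the issue description; the parser setup and the ssh helper below are illustrative assumptions, not the actual spark_ec2.py code:

    from optparse import OptionParser
    import subprocess
    import time

    parser = OptionParser()
    # Default of 2 preserves the current hard-coded behavior.
    parser.add_option(
        "--num-ssh-tries", type="int", default=2,
        help="Number of times to attempt an ssh operation (default: 2)")
    (opts, args) = parser.parse_args()

    def ssh(host, command):
        # Hypothetical helper: retry the ssh call up to opts.num_ssh_tries
        # times, sleeping between attempts so slow-starting instances have
        # time to come up before we give up.
        for attempt in range(opts.num_ssh_tries):
            try:
                return subprocess.check_call(["ssh", host, command])
            except subprocess.CalledProcessError:
                if attempt == opts.num_ssh_tries - 1:
                    raise
                time.sleep(5)

With this shape, running the script unchanged keeps today's two attempts, while e.g. --num-ssh-tries 5 gives slow clusters more headroom.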



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org