Posted to issues@spark.apache.org by "Daniel LaBar (JIRA)" <ji...@apache.org> on 2015/06/15 18:33:00 UTC

[jira] [Commented] (SPARK-6220) Allow extended EC2 options to be passed through spark-ec2

    [ https://issues.apache.org/jira/browse/SPARK-6220?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14586276#comment-14586276 ] 

Daniel LaBar commented on SPARK-6220:
-------------------------------------

[~nchammas], I also need IAM support and [made a few changes to spark_ec2.py|https://github.com/dnlbrky/spark/commit/5d4a9c65728245dc501c2a7c479ca27b6f685bd8], including an {{--instance-profile-name}} option.  These changes let me create the security groups and the master/slaves successfully without specifying an access key and secret, but I'm still having trouble getting Hadoop/YARN set up, so further changes may be needed.  Please let me know if you have suggestions.
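
For context, the shape of the change is roughly as follows. This is a simplified sketch rather than the actual diff (see the linked commit for the real code), and it assumes a boto 2.x release whose {{run_instances}} accepts {{instance_profile_name}}; the option and parameter names here are illustrative only:

{code}
# Simplified sketch of the --instance-profile-name wiring; names follow
# spark_ec2.py's optparse conventions but are illustrative, not the real patch.
from optparse import OptionParser

import boto.ec2


def parse_args():
    parser = OptionParser()
    # New option, alongside spark-ec2's existing --key-pair, --instance-type, etc.
    parser.add_option(
        "--instance-profile-name", default=None,
        help="IAM instance profile name to attach to the launched instances")
    parser.add_option("--region", default="us-east-1")
    parser.add_option("--key-pair")
    parser.add_option("--instance-type", default="m3.large")
    opts, _ = parser.parse_args()
    return opts


def launch(opts, ami_id):
    conn = boto.ec2.connect_to_region(opts.region)
    # With an instance profile attached, the nodes can authenticate to AWS
    # services via IAM role credentials instead of an access key and secret.
    return conn.run_instances(
        ami_id,
        key_name=opts.key_pair,
        instance_type=opts.instance_type,
        instance_profile_name=opts.instance_profile_name)
{code}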

This would be my first time contributing to an Apache project and I'm new to Spark/Python, so please forgive my greenness... Should I create another JIRA specifically to add instance profile support, or can I reference this JIRA when submitting a pull request?

> Allow extended EC2 options to be passed through spark-ec2
> ---------------------------------------------------------
>
>                 Key: SPARK-6220
>                 URL: https://issues.apache.org/jira/browse/SPARK-6220
>             Project: Spark
>          Issue Type: Improvement
>          Components: EC2
>            Reporter: Nicholas Chammas
>            Priority: Minor
>
> There are many EC2 options exposed by the boto library that spark-ec2 uses. 
> Over time, many of these EC2 options have bubbled up here and there to become spark-ec2 options.
> Examples:
> * spot prices
> * placement groups
> * VPC, subnet, and security group assignments
> It's likely that more and more EC2 options will trickle up like this to become spark-ec2 options.
> While major options are well suited to this kind of promotion, we should probably also let users pass arbitrary EC2 options through spark-ec2 in some generic way.
> Let's add two options:
> * {{--ec2-instance-option}} -> [{{boto::run}}|http://boto.readthedocs.org/en/latest/ref/ec2.html#boto.ec2.image.Image.run]
> * {{--ec2-spot-instance-option}} -> [{{boto::request_spot_instances}}|http://boto.readthedocs.org/en/latest/ref/ec2.html#boto.ec2.connection.EC2Connection.request_spot_instances]
> Each option can be specified multiple times and is simply passed directly to the underlying boto call.
> For example:
> {code}
> spark-ec2 \
>     ...
>     --ec2-instance-option "instance_initiated_shutdown_behavior=terminate" \
>     --ec2-instance-option "ebs_optimized=True"
> {code}
> I'm not sure about the exact syntax of the extended options, but something like this will do the trick as long as it can be made to pass the options correctly to boto in most cases.
> I followed the example of {{ssh}}, which supports multiple extended options similarly.
> {code}
> ssh -o LogLevel=ERROR -o UserKnownHostsFile=/dev/null ...
> {code}
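
On the generic pass-through idea quoted above, here is a minimal sketch of how the repeated {{key=value}} strings might be collected (optparse's {{action="append"}} gathers repeated occurrences into a list) and turned into keyword arguments for the underlying boto call. The {{parse_ec2_options}} helper and the literal-value coercion are illustrative assumptions, not part of the proposal:

{code}
import ast


def parse_ec2_options(raw_options):
    """Turn repeated "key=value" strings (e.g. from --ec2-instance-option)
    into a kwargs dict for the underlying boto call."""
    kwargs = {}
    for raw in raw_options:
        key, _, value = raw.partition("=")
        try:
            # Coerce booleans/numbers (e.g. "True" -> True); leave anything
            # that isn't a Python literal as a plain string.
            value = ast.literal_eval(value)
        except (ValueError, SyntaxError):
            pass
        kwargs[key.strip()] = value
    return kwargs


extra = parse_ec2_options([
    "instance_initiated_shutdown_behavior=terminate",
    "ebs_optimized=True",
])
# extra == {'instance_initiated_shutdown_behavior': 'terminate',
#           'ebs_optimized': True}
# ...which could then be forwarded as image.run(..., **extra) or
# conn.request_spot_instances(..., **extra).
{code}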


