Posted to issues@spark.apache.org by "Daniel Darabos (JIRA)" <ji...@apache.org> on 2014/08/21 14:48:11 UTC

[jira] [Commented] (SPARK-2291) Update EC2 scripts to use instance storage on m3 instance types

    [ https://issues.apache.org/jira/browse/SPARK-2291?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14105342#comment-14105342 ] 

Daniel Darabos commented on SPARK-2291:
---------------------------------------

I don't know if something has changed on Amazon's end or if I'm missing something. (I'm pretty clueless.) But we still see missing SSDs. This change fixed it for us: https://github.com/apache/spark/pull/2081/files. The block device mapping entries are necessary according to http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/InstanceStorage.html#InstanceStore_UsageScenarios.

I guess you tested PR #1156. It seemed to work for us too for a while, but now some of the machines come up without SSDs (/dev/sdb and /dev/sdc do not exist). So I read the docs and tried adding the block device mappings, and that seems to work: with PR #2081 all machines have the SSDs.
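To illustrate the fix described above: per the AWS docs linked earlier, instance-store volumes on m3 types only attach if an explicit block device mapping is passed at launch time. The sketch below is a hypothetical, dependency-free version of that logic (the real PR builds the equivalent structure with boto's BlockDeviceMapping/BlockDeviceType, setting ephemeral_name on each entry); the M3_EPHEMERAL_COUNTS table follows the published m3 instance-store specs.

```python
# Hypothetical sketch of the block-device-mapping fix, not the actual
# spark_ec2.py code. Instance-store (ephemeral) SSD counts for m3 types,
# per the AWS instance storage documentation.
M3_EPHEMERAL_COUNTS = {
    "m3.medium": 1,
    "m3.large": 1,
    "m3.xlarge": 2,
    "m3.2xlarge": 2,
}

def ephemeral_block_device_map(instance_type):
    """Return a device-name -> virtual-name mapping to pass at launch
    time so the instance-store SSDs actually attach. (With boto this
    would be a BlockDeviceMapping of BlockDeviceType entries with
    ephemeral_name set.)"""
    count = M3_EPHEMERAL_COUNTS.get(instance_type, 0)
    # /dev/sdb, /dev/sdc, ... map to ephemeral0, ephemeral1, ...
    return {
        "/dev/sd" + chr(ord("b") + i): "ephemeral%d" % i
        for i in range(count)
    }
```

Without a mapping like this, an m3 instance launches with no /dev/sdb or /dev/sdc even though the SSDs are nominally part of the instance type, which matches the symptom described above.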

Hope this makes sense.

> Update EC2 scripts to use instance storage on m3 instance types
> ---------------------------------------------------------------
>
>                 Key: SPARK-2291
>                 URL: https://issues.apache.org/jira/browse/SPARK-2291
>             Project: Spark
>          Issue Type: Improvement
>          Components: EC2
>    Affects Versions: 0.9.0, 0.9.1, 1.0.0
>            Reporter: Alessandro Andrioni
>
> [On January 21|https://aws.amazon.com/about-aws/whats-new/2014/01/21/announcing-new-amazon-ec2-m3-instance-sizes-and-lower-prices-for-amazon-s3-and-amazon-ebs/], Amazon added SSD-backed instance storage for m3 instances, and also added two new types: m3.medium and m3.large.



--
This message was sent by Atlassian JIRA
(v6.2#6252)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org