Posted to user@spark.apache.org by Marco Mistroni <mm...@gmail.com> on 2016/07/25 14:37:40 UTC

Pls assist: Creating Spark EC2 cluster using spark_ec2.py script and a custom AMI

Hi all,
I was wondering if anyone can help with this.
I have created a Spark cluster before using the spark_ec2.py script from Spark
1.6.1, which by default uses a very old AMI, so I decided to try launching the
script with a more up-to-date AMI.
The one I used is ami-d732f0b7, which refers to Ubuntu Server 14.04
LTS (HVM), SSD Volume Type.



I have launched the script as follows:

./spark-ec2  --key-pair=ec2AccessKey --identity-file ec2AccessKey.pem
--region=us-west-2 --ami ami-d732f0b7 launch my-spark-cluster

but I am getting this exception:

Non-Windows instances with a virtualization type of 'hvm' are currently not
supported for this instance type.

which seems a bizarre exception to me, as in spark_ec2.py the m1.large
instance type (the one used to create the Spark master and nodes) is
associated with virtualization=pvm:

"m1.large":    "pvm"
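For context, spark_ec2.py keeps a plain table mapping instance type to virtualization type and picks the AMI accordingly. A minimal sketch of that lookup (the m1/m3 values below match the script's table as far as I can tell; the helper name itself is illustrative, not from spark_ec2.py):

```shell
#!/bin/sh
# Sketch of the instance-type -> virtualization lookup that spark_ec2.py
# performs. Only two families are shown; the real table is much longer.
virt_for_instance_type() {
  case "$1" in
    m1.*) echo "pvm" ;;     # m1 family: paravirtual only
    m3.*) echo "hvm" ;;     # m3 family: HVM
    *)    echo "unknown" ;;
  esac
}

# The mismatch reported above: the chosen AMI is HVM, but m1.large is PV-only.
virt_for_instance_type m1.large   # prints "pvm"
```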


Has anyone run into a similar issue? Any suggestions on how I can use a custom
AMI when creating a Spark cluster?

Kind regards,
Marco

Re: Pls assist: Creating Spark EC2 cluster using spark_ec2.py script and a custom AMI

Posted by Mayank Ahuja <ma...@qubole.com>.
Hi Marco,

From the AMI name shared, it seems to be an HVM image. The 'm1' instance
family does not support HVM (only PV is supported). You can either use the PV
equivalent of this image, or switch to the 'm3' family (the easiest transition
from m1, if possible).

Details:
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/virtualization_types.html

Thanks
Mayank
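Concretely, one way to apply this advice is to keep the HVM Ubuntu AMI and force an HVM-capable instance family via spark-ec2's --instance-type option. An untested sketch (the AWS CLI check assumes configured credentials; adjust names and region to your setup):

```shell
# Optional: confirm the AMI's virtualization type first (should print "hvm").
aws ec2 describe-images --image-ids ami-d732f0b7 --region us-west-2 \
  --query 'Images[0].VirtualizationType' --output text

# Same launch as before, but with an m3 instance type so the instance
# family matches the HVM AMI instead of defaulting to PV-only m1.large.
./spark-ec2 --key-pair=ec2AccessKey --identity-file=ec2AccessKey.pem \
  --region=us-west-2 \
  --instance-type=m3.large \
  --ami ami-d732f0b7 \
  launch my-spark-cluster
```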
