Posted to issues@spark.apache.org by "Grzegorz Dubicki (JIRA)" <ji...@apache.org> on 2015/01/17 21:53:34 UTC

[jira] [Comment Edited] (SPARK-5298) Spark not starting on EC2 using spark-ec2

    [ https://issues.apache.org/jira/browse/SPARK-5298?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14281535#comment-14281535 ] 

Grzegorz Dubicki edited comment on SPARK-5298 at 1/17/15 8:53 PM:
------------------------------------------------------------------

Re 1: I am sorry, I had not noticed the warnings. I would not have used an unsupported instance type if I had known. It would be nice if the script asked something like "Not supported instance type. Continue anyway?"...
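
Such a prompt could be a small guard in the launch script. A minimal sketch, assuming hypothetical names -- `KNOWN_INSTANCE_TYPES`, `is_supported`, and `confirm_launch` are illustrations, not part of the real spark-ec2 script:

```python
# Hypothetical confirmation guard for spark-ec2 -- the set of known types
# and all names here are assumptions, not the actual spark-ec2 code.
KNOWN_INSTANCE_TYPES = {"m1.small", "m1.large", "m3.medium", "m3.large"}

def is_supported(instance_type):
    """True if the script recognizes this instance type."""
    return instance_type in KNOWN_INSTANCE_TYPES

def confirm_launch(instance_type):
    """Ask the user before continuing with an unrecognized instance type."""
    if is_supported(instance_type):
        return True
    answer = input("Not supported instance type: %s. Continue anyway? (y/N) "
                   % instance_type)
    return answer.strip().lower() == "y"
```

With such a guard, launching with t2.micro would stop for confirmation instead of proceeding past a warning that is easy to miss in the output.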

But switching to m3.medium didn't help. The launch output still includes the "ERROR: Unknown Spark version" message. See the full output here: https://gist.github.com/grzegorz-dubicki/4959eb97f9b1ca8e00ad

And there is still effectively no Spark on the master:
{noformat}
[root@ip-172-31-47-137 ~]$ ls spark
conf  work
{noformat}
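
The "ERROR: Unknown Spark version" message together with the nearly empty ~/spark directory suggests the init script on the AMI matches the requested version against a hardcoded list and downloads nothing on a miss. A hedged sketch of that failure mode -- the function name and version list are assumptions, not the real mesos/spark-ec2 init.sh:

```shell
# Hypothetical sketch of the version lookup in spark-ec2's init script.
# If 1.2.0 is absent from the list (e.g. an outdated script on the AMI),
# the download step is skipped entirely.
resolve_spark_tarball() {
  case "$1" in
    1.0.0|1.0.1|1.1.0|1.1.1)
      # A recognized version maps to a tarball name the script would fetch.
      echo "spark-$1-bin-hadoop1.tgz"
      ;;
    *)
      # An unrecognized version falls through here: nothing is downloaded,
      # which would leave ~/spark holding only conf/ and work/.
      echo "ERROR: Unknown Spark version" >&2
      return 1
      ;;
  esac
}
```

If that is the mechanism, the later stop-all.sh / start-master.sh failures follow directly: the sbin/ scripts were never unpacked.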

Trying to apply your suggestion no 2...



> Spark not starting on EC2 using spark-ec2
> -----------------------------------------
>
>                 Key: SPARK-5298
>                 URL: https://issues.apache.org/jira/browse/SPARK-5298
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.2.0
>            Environment: Spark 1.2.0 plus PR https://github.com/mesos/spark-ec2/pull/76, from my fork https://github.com/grzegorz-dubicki/spark, and the v4 Spark EC2 script with the same fix from https://github.com/grzegorz-dubicki/spark-ec2
>            Reporter: Grzegorz Dubicki
>
> Spark doesn't start after launching the cluster with:
> {noformat}
> ./spark-ec2 -k * -i * -s 1 --region=eu-west-1 --instance-type=t2.micro --spark-version=1.2.0 launch test2
> {noformat}
> (Output: https://gist.github.com/grzegorz-dubicki/f15caf9ff6c96ec69fee)
> ...or after stopping the instances via the AWS Console and restarting the cluster with:
> {noformat}
> ./spark-ec2 -k * -i * --region=eu-west-1 start test2
> {noformat}
> (Output: https://gist.github.com/grzegorz-dubicki/8b87192b3aa4e0ed028c)
> Please note these errors in the launch output:
> {noformat}
> ~/spark-ec2
> Initializing spark
> ~ ~/spark-ec2
> ERROR: Unknown Spark version
> Initializing shark
> ~ ~/spark-ec2 ~/spark-ec2
> ERROR: Unknown Shark version
> {noformat}
> ...and then these in the start output:
> {noformat}
> ./spark-standalone/setup.sh: line 26: /root/spark/sbin/stop-all.sh: Nie ma takiego pliku ani katalogu
> ./spark-standalone/setup.sh: line 31: /root/spark/sbin/start-master.sh: Nie ma takiego pliku ani katalogu
> ./spark-standalone/setup.sh: line 37: /root/spark/sbin/start-slaves.sh: Nie ma takiego pliku ani katalogu
> {noformat}
> (the error message is Polish for "No such file or directory")
> It seems to be related to http://mail-archives.us.apache.org/mod_mbox/spark-user/201412.mbox/%3cCAJ5A9B_U=mDCXYftDkbk+sLJzBCdPcb0qQS83u0grOzfgkcEow@mail.gmail.com%3e - I also have nearly empty Spark and Shark directories on the master of the test2 cluster:
> {noformat}
> [root@ip-172-31-7-179 ~]$ ls spark
> conf  work
> [root@ip-172-31-7-179 ~]$ ls shark/
> conf
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org