Posted to user@spark.apache.org by Al Thompson <at...@gmail.com> on 2014/12/17 14:58:40 UTC

spark-ec2 starts hdfs1, tachyon but not spark

Hi All:

I am new to Spark. I recently checked out and built Spark 1.2 RC2 as an
assembly.
I then ran spark-ec2 according to:

http://spark.apache.org/docs/latest/ec2-scripts.html

I got master and slave instances in EC2 after running

./src/spark/ec2/spark-ec2 -k mykey -i mykey.pem -s 1 launch myclus
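
For reference, this is how I looked up the master afterwards (get-master is
a standard spark-ec2 action, if I read the script right):

$ ./src/spark/ec2/spark-ec2 -k mykey -i mykey.pem get-master myclus  # prints the master's public hostname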

Everything seemed to run OK. However, I get no web UIs for the Spark master
or slave. Logging into the nodes, I see HDFS and Tachyon processes but none
for Spark.
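
Concretely, this is what I checked on the master (assuming 8080 is still
the default port for the master web UI):

$ ./src/spark/ec2/spark-ec2 -k mykey -i mykey.pem login myclus  # SSH into the master
$ jps                          # shows the HDFS and Tachyon daemons, but no Spark Master/Worker
$ curl http://localhost:8080   # master web UI port; nothing answers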

The /root/tachyon folder has a full complement of files including conf,
logs and so forth:

$ ls /root/tachyon
bin   docs     libexec  logs     README.md  target
conf  journal  LICENSE  pom.xml  src

The /root/spark folder only has a conf dir:

$ ls /root/spark
conf

If I try to run the Spark setup script, I see errors like:

Setting up spark-standalone
RSYNC'ing /root/spark/conf to slaves...
ec2-some-ip.compute-1.amazonaws.com
./spark-standalone/setup.sh: line 22: /root/spark/sbin/stop-all.sh: No such file or directory
./spark-standalone/setup.sh: line 27: /root/spark/sbin/start-master.sh: No such file or directory
./spark-standalone/setup.sh: line 33: /root/spark/sbin/start-slaves.sh: No such file or directory

This makes it seem that the Spark distribution did not get unpacked
properly on the cluster. Any hints or workarounds for fixing this?
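
One workaround I was going to try, in case the first setup pass was simply
interrupted, is re-running launch with --resume, which as I understand it
re-runs the setup scripts on the existing instances instead of provisioning
new ones:

$ ./src/spark/ec2/spark-ec2 -k mykey -i mykey.pem -s 1 launch myclus --resume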

Cheers,
Al