Posted to user@spark.apache.org by Darin McBeath <dd...@yahoo.com.INVALID> on 2014/11/12 22:35:48 UTC

ec2 script and SPARK_LOCAL_DIRS not created

I'm using Spark 1.1 and the provided ec2 scripts to start my cluster (r3.8xlarge machines).  From the spark-shell, I can verify that the environment variable is set:

scala> System.getenv("SPARK_LOCAL_DIRS")
res0: String = /mnt/spark,/mnt2/spark
However, when I look on the workers, the directories for /mnt/spark and /mnt2/spark do not exist.
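For reference, this is roughly how I'm checking on each worker (a minimal sketch; the paths come from SPARK_LOCAL_DIRS above, and you'd run it over ssh on each worker node):

```shell
# Check whether the directories named in SPARK_LOCAL_DIRS actually exist
for d in /mnt/spark /mnt2/spark; do
  if [ -d "$d" ]; then
    echo "$d exists"
  else
    echo "$d missing"
  fi
done
```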
Am I missing something?  Has anyone else noticed this?
A colleague started a cluster (using the same ec2 scripts) but with m3.xlarge machines, and both the /mnt/spark and /mnt2/spark directories were created.
Thanks.
Darin.