Posted to user@spark.apache.org by luanjunyi <lu...@gmail.com> on 2014/09/10 04:41:49 UTC

Re: Spark EC2 standalone - Utils.fetchFile no such file or directory

I've run into what is probably the same problem and just figured out the solution.

The error occurs because Spark tries to write to its scratch directory, but
the path does not exist.

It's likely you are running the app on the master node only. In the
spark-ec2 setup, the Spark scratch directory (spark.local.dir) is set to
/mnt/spark in conf/spark-env.sh. That path exists on all of the slave nodes
but not on the master node, hence the error.
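
For reference, the relevant line that the spark-ec2 scripts put into
conf/spark-env.sh looks roughly like the following (the exact value can vary
with the script version and instance type):

    # conf/spark-env.sh on a spark-ec2 cluster (exact contents may vary)
    export SPARK_LOCAL_DIRS="/mnt/spark"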

So if you set the master URL to spark://your-master-node-domain:7077, the
error will go away, since the executors will all run on the slave nodes,
where /mnt/spark exists. If you need to test on the master node, either
create /mnt/spark yourself or change the SPARK_LOCAL_DIRS entry in
conf/spark-env.sh to an existing path you have write permission for.
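
For example, either of the following should work (a rough sketch; YourApp,
your-app.jar and the hostname are placeholders for your own application and
cluster):

    # Run against the cluster, so the executors use /mnt/spark on the slaves
    spark-submit --master spark://your-master-node-domain:7077 \
      --class YourApp your-app.jar

    # Or create the missing scratch directory on the master node as well
    sudo mkdir -p /mnt/spark
    sudo chown $(whoami) /mnt/spark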

Note that the environment variables defined in conf/spark-env.sh are meant
for machine-specific settings, so they override the corresponding settings in
your SparkConf object even if you provide one.
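
In other words, something like the sketch below will not change the scratch
directory on a spark-ec2 cluster, because SPARK_LOCAL_DIRS from
conf/spark-env.sh takes precedence (the path and placeholders are just
examples):

    # spark.local.dir set from the application side is ignored here,
    # since SPARK_LOCAL_DIRS in conf/spark-env.sh overrides it
    spark-submit --master spark://your-master-node-domain:7077 \
      --conf spark.local.dir=/tmp/spark-scratch \
      --class YourApp your-app.jar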




