Posted to issues@spark.apache.org by "Littlestar (JIRA)" <ji...@apache.org> on 2015/04/29 17:15:07 UTC
[jira] [Resolved] (SPARK-6461) spark.executorEnv.PATH in spark-defaults.conf is not passed to Mesos
[ https://issues.apache.org/jira/browse/SPARK-6461?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Littlestar resolved SPARK-6461.
-------------------------------
Resolution: Duplicate
The environment is missing in spark-1.3.1-hadoop2.4.0.tgz.
MARK here:
spark-env.sh on the driver node only affects spark-submit (the driver node).
The conf/spark-env.sh shipped inside spark-1.3.1-hadoop2.4.0.tgz works well for worker nodes.
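The workaround above relies on the executor-side spark-env.sh inside the distributed tarball. The original intent of the report was to set the executor environment via spark-defaults.conf on the driver; a minimal sketch of such a configuration (the paths here are illustrative, not taken from the report):

```properties
# conf/spark-defaults.conf on the driver node.
# spark.executorEnv.* entries are meant to be exported into each executor's
# environment; per this issue they did not reach the Mesos fetcher in 1.3.0.
spark.executorEnv.JAVA_HOME     /usr/lib/jvm/java-7-openjdk
spark.executorEnv.HADOOP_HOME   /opt/hadoop-2.4.0
spark.executorEnv.PATH          /opt/hadoop-2.4.0/bin:/usr/bin:/bin
```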
> spark.executorEnv.PATH in spark-defaults.conf is not passed to Mesos
> ------------------------------------------------------------------
>
> Key: SPARK-6461
> URL: https://issues.apache.org/jira/browse/SPARK-6461
> Project: Spark
> Issue Type: Bug
> Components: Scheduler
> Affects Versions: 1.3.0
> Reporter: Littlestar
>
> I used Mesos to run Spark 1.3.0 ./run-example SparkPi,
> but it failed.
> spark.executorEnv.PATH in spark-defaults.conf is not passed to Mesos:
> spark.executorEnv.PATH
> spark.executorEnv.HADOOP_HOME
> spark.executorEnv.JAVA_HOME
> E0323 14:24:36.400635 11355 fetcher.cpp:109] HDFS copyToLocal failed: hadoop fs -copyToLocal 'hdfs://192.168.1.9:54310/home/test/spark-1.3.0-bin-2.4.0.tar.gz' '/home/mesos/work_dir/slaves/20150323-100710-1214949568-5050-3453-S3/frameworks/20150323-133400-1214949568-5050-15440-0007/executors/20150323-100710-1214949568-5050-3453-S3/runs/915b40d8-f7c4-428a-9df8-ac9804c6cd21/spark-1.3.0-bin-2.4.0.tar.gz'
> sh: hadoop: command not found
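The "command not found" failure in the quoted log is what happens when the fetcher's shell starts without the expected PATH. A minimal sketch that reproduces the symptom with an intentionally empty PATH (hadoop itself need not be installed; /bin/sh is assumed to exist):

```shell
# Simulate the Mesos fetcher spawning 'hadoop fs -copyToLocal ...' in a
# shell whose environment was not populated from spark.executorEnv.*.
# With an empty PATH the command cannot be resolved, mirroring the
# "sh: hadoop: command not found" error in the log above.
out=$(env -i PATH= /bin/sh -c 'hadoop version' 2>&1 || true)
echo "$out"
# Restoring PATH (e.g. to include $HADOOP_HOME/bin) would let the same
# shell resolve the binary -- which is what spark-env.sh on the worker
# node achieves.
```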
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)