Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2014/11/25 13:03:12 UTC
[jira] [Resolved] (SPARK-2404) spark-submit and spark-class may overwrite the already defined SPARK_HOME
[ https://issues.apache.org/jira/browse/SPARK-2404?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-2404.
------------------------------
Resolution: Won't Fix
Fix Version/s: (was: 1.0.1)
According to the PR discussion, this is a Won't Fix. The underlying problem of SPARK_HOME not matching the remote cluster directories was to be resolved in a separate PR.
> spark-submit and spark-class may overwrite the already defined SPARK_HOME
> -------------------------------------------------------------------------
>
> Key: SPARK-2404
> URL: https://issues.apache.org/jira/browse/SPARK-2404
> Project: Spark
> Issue Type: Bug
> Affects Versions: 1.0.0
> Reporter: Nan Zhu
> Assignee: Nan Zhu
>
> In spark-class and spark-submit,
> SPARK_HOME is set to the present working directory, overwriting the value of an already defined SPARK_HOME.
> We should not overwrite SPARK_HOME if it has already been defined.
> Our scenario:
> We have a login portal for all team members to use the Spark cluster; everyone gets an account and a home directory.
> spark-1.0 is copied to the root path "/", and every account gets a soft link to /spark-1.0 in its home directory.
> Spark 1.0 is deployed on a cluster whose user name differs from every login-portal account except one, say "nanzhu".
> So when a user runs spark-shell, it always tries to run /home/user_account/spark-1.0/bin/compute-class.sh, which does not exist.
> We set a global SPARK_HOME to /home/nanzhu/spark-1.0, consistent with the remote cluster setup, but unfortunately this is overwritten by spark-class and spark-submit.
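The fix requested above amounts to guarding the assignment in the launch scripts so a pre-set SPARK_HOME wins. A minimal sketch of that guard (the FWDIR variable name and paths are illustrative, not the actual Spark script contents):

```shell
#!/bin/sh
# Directory one level above this script, the usual fallback for SPARK_HOME.
FWDIR="$(cd "$(dirname "$0")/.."; pwd)"

# Only set SPARK_HOME if the user has not already defined it,
# so a globally exported SPARK_HOME is respected.
if [ -z "${SPARK_HOME}" ]; then
  export SPARK_HOME="$FWDIR"
fi
```

With this guard, `SPARK_HOME=/home/nanzhu/spark-1.0 spark-shell` would keep the user's value instead of replacing it with the script's own location.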
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org