Posted to commits@spark.apache.org by jo...@apache.org on 2015/04/08 19:15:01 UTC

spark git commit: [SPARK-6506] [pyspark] Do not try to retrieve SPARK_HOME when not needed...

Repository: spark
Updated Branches:
  refs/heads/master 15e0d2bd1 -> f7e21dd1e


[SPARK-6506] [pyspark] Do not try to retrieve SPARK_HOME when not needed...

....

In particular, the unconditional lookup makes pyspark in yarn-cluster mode
fail unless SPARK_HOME is set, even though it is not actually needed there.

Author: Marcelo Vanzin <va...@cloudera.com>

Closes #5405 from vanzin/SPARK-6506 and squashes the following commits:

e184507 [Marcelo Vanzin] [SPARK-6506] [pyspark] Do not try to retrieve SPARK_HOME when not needed.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f7e21dd1
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f7e21dd1
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/f7e21dd1

Branch: refs/heads/master
Commit: f7e21dd1ec4541be54eb01d8b15cfcc6714feed0
Parents: 15e0d2b
Author: Marcelo Vanzin <va...@cloudera.com>
Authored: Wed Apr 8 10:14:52 2015 -0700
Committer: Josh Rosen <jo...@databricks.com>
Committed: Wed Apr 8 10:14:52 2015 -0700

----------------------------------------------------------------------
 python/pyspark/java_gateway.py | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/f7e21dd1/python/pyspark/java_gateway.py
----------------------------------------------------------------------
diff --git a/python/pyspark/java_gateway.py b/python/pyspark/java_gateway.py
index 0a16cbd..2a5e84a 100644
--- a/python/pyspark/java_gateway.py
+++ b/python/pyspark/java_gateway.py
@@ -29,11 +29,10 @@ from pyspark.serializers import read_int
 
 
 def launch_gateway():
-    SPARK_HOME = os.environ["SPARK_HOME"]
-
     if "PYSPARK_GATEWAY_PORT" in os.environ:
         gateway_port = int(os.environ["PYSPARK_GATEWAY_PORT"])
     else:
+        SPARK_HOME = os.environ["SPARK_HOME"]
         # Launch the Py4j gateway using Spark's run command so that we pick up the
         # proper classpath and settings from spark-env.sh
         on_windows = platform.system() == "Windows"

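For context, here is a minimal sketch (not the actual pyspark source) of the control flow in launch_gateway() after this patch: SPARK_HOME is only consulted on the branch that actually has to launch spark-submit, while the PYSPARK_GATEWAY_PORT path (used when the JVM side, e.g. in yarn-cluster mode, has already started the Py4J gateway) never touches it. The _start_jvm_gateway helper below is hypothetical and stands in for the real spark-submit launching code that the file contains.

import os
import platform


def _start_jvm_gateway(script_path):
    # Hypothetical stand-in for the real logic, which launches spark-submit
    # as a subprocess and reads the Py4J gateway port back from the child.
    raise NotImplementedError("placeholder for the spark-submit launch logic")


def launch_gateway_sketch():
    if "PYSPARK_GATEWAY_PORT" in os.environ:
        # The JVM has already started the Py4J gateway and exported its port
        # (as happens in yarn-cluster mode), so SPARK_HOME is never read here.
        return int(os.environ["PYSPARK_GATEWAY_PORT"])

    # Only the branch that launches a JVM itself needs SPARK_HOME, which is
    # why the patch moves the lookup inside this block.
    spark_home = os.environ["SPARK_HOME"]
    on_windows = platform.system() == "Windows"
    script = "./bin/spark-submit.cmd" if on_windows else "./bin/spark-submit"
    return _start_jvm_gateway(os.path.join(spark_home, script))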
