Posted to commits@spark.apache.org by sr...@apache.org on 2015/10/30 19:50:21 UTC

spark git commit: [SPARK-11342][TESTS] Allow to set hadoop profile when running dev/run_tests

Repository: spark
Updated Branches:
  refs/heads/master 40c77fb23 -> 729f983e6


[SPARK-11342][TESTS] Allow to set hadoop profile when running dev/run_tests

Author: Jeff Zhang <zj...@apache.org>

Closes #9295 from zjffdu/SPARK-11342.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/729f983e
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/729f983e
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/729f983e

Branch: refs/heads/master
Commit: 729f983e66cf65da2e8f48c463ccde2b355240c4
Parents: 40c77fb
Author: Jeff Zhang <zj...@apache.org>
Authored: Fri Oct 30 18:50:12 2015 +0000
Committer: Sean Owen <so...@cloudera.com>
Committed: Fri Oct 30 18:50:12 2015 +0000

----------------------------------------------------------------------
 dev/run-tests.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/729f983e/dev/run-tests.py
----------------------------------------------------------------------
diff --git a/dev/run-tests.py b/dev/run-tests.py
index 6b4b710..9e1abb0 100755
--- a/dev/run-tests.py
+++ b/dev/run-tests.py
@@ -486,7 +486,7 @@ def main():
     else:
         # else we're running locally and can use local settings
         build_tool = "sbt"
-        hadoop_version = "hadoop2.3"
+        hadoop_version = os.environ.get("HADOOP_PROFILE", "hadoop2.3")
         test_env = "local"
 
     print("[info] Using build tool", build_tool, "with Hadoop profile", hadoop_version,

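The change is a one-line edit: when dev/run-tests is run locally, the Hadoop profile is now read from the HADOOP_PROFILE environment variable, falling back to "hadoop2.3" when the variable is unset. A minimal sketch of the same env-var-with-default pattern follows; the helper name and the hadoop2.6 value are illustrative assumptions, not taken from the Spark sources.

    import os

    def pick_hadoop_profile():
        # Prefer the HADOOP_PROFILE environment variable when it is set;
        # otherwise fall back to the previous hard-coded default.
        return os.environ.get("HADOOP_PROFILE", "hadoop2.3")

    if __name__ == "__main__":
        # Invoked e.g. as: HADOOP_PROFILE=hadoop2.6 python sketch.py
        # (hadoop2.6 is just an example value, not confirmed by this diff)
        print("[info] Using Hadoop profile", pick_hadoop_profile())

With the patched script, a local run would presumably look like HADOOP_PROFILE=<profile> ./dev/run-tests, while omitting the variable keeps the old hadoop2.3 default.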

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org