Posted to commits@spark.apache.org by an...@apache.org on 2014/09/04 03:39:18 UTC

git commit: [SPARK-3303][core] fix SparkContextSchedulerCreationSuite test error

Repository: spark
Updated Branches:
  refs/heads/master a52240792 -> e08ea7393


[SPARK-3303][core] fix SparkContextSchedulerCreationSuite test error

Running the test on the master branch with this command, with the Mesos native library set:
sbt/sbt -Phive "test-only org.apache.spark.SparkContextSchedulerCreationSuite"

fails with this error:
[info] SparkContextSchedulerCreationSuite:
[info] - bad-master
[info] - local
[info] - local-*
[info] - local-n
[info] - local--n-failures
[info] - local-n-failures
[info] - bad-local-n
[info] - bad-local-n-failures
[info] - local-default-parallelism
[info] - simr
[info] - local-cluster
[info] - yarn-cluster
[info] - yarn-standalone
[info] - yarn-client
[info] - mesos fine-grained
[info] - mesos coarse-grained ** FAILED ***
[info] Executor Spark home `spark.mesos.executor.home` is not set!
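The failure occurs because `executorSparkHome` was a field of `CoarseMesosSchedulerBackend`, so the lookup runs as soon as the backend is constructed, and this suite neither sets `spark.mesos.executor.home` nor provides a Spark home fallback. A minimal sketch of the eager pattern that throws (the class name and path below are illustrative, not the real backend):

import org.apache.spark.{SparkConf, SparkException}

class EagerBackend(conf: SparkConf, sparkHome: Option[String]) {
  // Evaluated at construction time: merely creating the backend already requires the setting.
  val executorSparkHome: String = conf.getOption("spark.mesos.executor.home")
    .orElse(sparkHome)
    .getOrElse {
      throw new SparkException("Executor Spark home `spark.mesos.executor.home` is not set!")
    }
}

// Setting the key up front avoids the exception, e.g.:
//   val conf = new SparkConf().set("spark.mesos.executor.home", "/opt/spark")  // path is illustrative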

Since `executorSparkHome` is only used in `createCommand`, move `val executorSparkHome...` into `createCommand` to fix this issue.
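After the change, the lookup is deferred until `createCommand` actually builds an executor command for an offer, so constructing the backend (as the test suite does) no longer requires the property. A hedged sketch of the deferred lookup, with a simplified signature in place of the real `createCommand(offer: Offer, numCores: Int)`:

import org.apache.spark.{SparkConf, SparkException}

class LazyBackend(conf: SparkConf, sparkHome: Option[String]) {
  // No executorSparkHome field: constructing the backend is safe without the setting.

  def createCommand(numCores: Int): String = {
    // Resolved only when a command is built, mirroring the moved val in the diff below.
    val executorSparkHome = conf.getOption("spark.mesos.executor.home")
      .orElse(sparkHome)
      .getOrElse {
        throw new SparkException("Executor Spark home `spark.mesos.executor.home` is not set!")
      }
    s"cd $executorSparkHome && ./bin/spark-class ... ($numCores cores)"  // illustrative command string
  }
}

Deferring the check to the point of use is the usual fix when a constructor-time requirement is stricter than what callers such as this test suite actually need.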

Author: scwf <wa...@huawei.com>
Author: wangfei <wa...@126.com>

Closes #2199 from scwf/SparkContextSchedulerCreationSuite and squashes the following commits:

ef1de22 [scwf] fix code format
19d26f3 [scwf] fix conflict
d9a8a60 [wangfei] fix SparkContextSchedulerCreationSuite test error


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/e08ea739
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/e08ea739
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/e08ea739

Branch: refs/heads/master
Commit: e08ea7393df46567f552aa67c60a690c231775e4
Parents: a522407
Author: scwf <wa...@huawei.com>
Authored: Wed Sep 3 18:39:13 2014 -0700
Committer: Andrew Or <an...@gmail.com>
Committed: Wed Sep 3 18:39:13 2014 -0700

----------------------------------------------------------------------
 .../cluster/mesos/CoarseMesosSchedulerBackend.scala       | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/e08ea739/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala b/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
index da43ef5..6456840 100644
--- a/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
+++ b/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
@@ -71,11 +71,6 @@ private[spark] class CoarseMesosSchedulerBackend(
   val taskIdToSlaveId = new HashMap[Int, String]
   val failuresBySlaveId = new HashMap[String, Int] // How many times tasks on each slave failed
 
-  val executorSparkHome = conf.getOption("spark.mesos.executor.home")
-    .orElse(sc.getSparkHome())
-    .getOrElse {
-      throw new SparkException("Executor Spark home `spark.mesos.executor.home` is not set!")
-    }
 
   val extraCoresPerSlave = conf.getInt("spark.mesos.extra.cores", 0)
 
@@ -112,6 +107,11 @@ private[spark] class CoarseMesosSchedulerBackend(
   }
 
   def createCommand(offer: Offer, numCores: Int): CommandInfo = {
+    val executorSparkHome = conf.getOption("spark.mesos.executor.home")
+      .orElse(sc.getSparkHome())
+      .getOrElse {
+        throw new SparkException("Executor Spark home `spark.mesos.executor.home` is not set!")
+      }
     val environment = Environment.newBuilder()
     val extraClassPath = conf.getOption("spark.executor.extraClassPath")
     extraClassPath.foreach { cp =>

