Posted to commits@spark.apache.org by an...@apache.org on 2015/05/28 21:40:17 UTC

spark git commit: [MINOR] [CORE] Warn about caching if dynamic allocation is enabled (1.3)

Repository: spark
Updated Branches:
  refs/heads/branch-1.3 68387e357 -> 33e1539b3


[MINOR] [CORE] Warn about caching if dynamic allocation is enabled (1.3)

This is a resubmit of #5751 for branch-1.3. The previous cherry-pick caused a build break that was later [reverted](https://github.com/apache/spark/commit/2254576e10ee433423aa8accf2d84f12ec20fc97). Originally written by vanzin.

Author: Andrew Or <an...@databricks.com>

Closes #6421 from andrewor14/warn-da-cache-1.3 and squashes the following commits:

25cbb53 [Andrew Or] If DA is enabled, warn about caching
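
As context for the one-liner above: on this branch SparkContext holds the allocation manager as an Option, so the "if DA is enabled" check is expressed with Option.foreach. Below is a minimal, standalone sketch of that guard idiom; the names (OptionGuardSketch, AllocationManager, warnIfDynamic) are illustrative stand-ins, not part of the commit.

object OptionGuardSketch {
  case class AllocationManager(name: String)  // hypothetical stand-in for ExecutorAllocationManager

  def warnIfDynamic(manager: Option[AllocationManager], rddId: Int): Unit = {
    manager.foreach { _ =>
      // Runs only when the manager is present, i.e. dynamic allocation is on.
      println(s"WARN: cached data for RDD $rddId may be lost when executors are removed")
    }
  }

  def main(args: Array[String]): Unit = {
    warnIfDynamic(None, 1)                            // no output
    warnIfDynamic(Some(AllocationManager("dyn")), 2)  // prints the warning
  }
}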


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/33e1539b
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/33e1539b
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/33e1539b

Branch: refs/heads/branch-1.3
Commit: 33e1539b326b9b2c1053c6b0b54f4b9d2ded821a
Parents: 68387e3
Author: Andrew Or <an...@databricks.com>
Authored: Thu May 28 12:40:13 2015 -0700
Committer: Andrew Or <an...@databricks.com>
Committed: Thu May 28 12:40:13 2015 -0700

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/SparkContext.scala | 5 +++++
 1 file changed, 5 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/33e1539b/core/src/main/scala/org/apache/spark/SparkContext.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index d5e9168..47818ee 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -1293,6 +1293,11 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationCli
    * Register an RDD to be persisted in memory and/or disk storage
    */
   private[spark] def persistRDD(rdd: RDD[_]) {
+    executorAllocationManager.foreach { _ =>
+      logWarning(
+        s"Dynamic allocation currently does not support cached RDDs. Cached data for RDD " +
+        s"${rdd.id} will be lost when executors are removed.")
+    }
     persistentRdds(rdd.id) = rdd
   }
 
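For reference, a sketch of a driver program that would hit the new warning on branch-1.3: with dynamic allocation enabled, any cache()/persist() call routes through SparkContext.persistRDD, which now logs the message added above. The configuration keys are standard Spark settings; the app itself (CacheWarningDemo) is illustrative and not part of the commit.

import org.apache.spark.{SparkConf, SparkContext}

// Illustrative driver app: with dynamic allocation enabled, cache()/persist()
// ends up in SparkContext.persistRDD, which now logs the warning added above.
object CacheWarningDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("cache-warning-demo")
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true") // external shuffle service is required for dynamic allocation
    val sc = new SparkContext(conf)

    val rdd = sc.parallelize(1 to 100).cache() // routes through persistRDD -> logWarning
    println(rdd.count())

    sc.stop()
  }
}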


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org