Posted to reviews@spark.apache.org by vanzin <gi...@git.apache.org> on 2018/02/08 21:27:24 UTC

[GitHub] spark pull request #20546: [SPARK-20659][Core] Removing sc.getExecutorStorag...

Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20546#discussion_r167072049
  
    --- Diff: core/src/test/scala/org/apache/spark/DistributedSuite.scala ---
    @@ -160,10 +160,6 @@ class DistributedSuite extends SparkFunSuite with Matchers with LocalSparkContex
         val data = sc.parallelize(1 to 1000, 10)
         val cachedData = data.persist(storageLevel)
         assert(cachedData.count === 1000)
    -    assert(sc.getExecutorStorageStatus.map(_.rddBlocksById(cachedData.id).size).sum ===
    --- End diff ---
    
    You could replace these with code based on `sc.statusStore`.
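
    For reference, a minimal sketch of what an `sc.statusStore`-based check could look like inside this test (assuming the code stays in the `org.apache.spark` package, since `statusStore` is `private[spark]`, and that `numCachedPartitions` stands in for the old per-executor `rddBlocksById(...).size` sum; the store is fed asynchronously from listener events, so the assertion is wrapped in ScalaTest's `eventually`):

        import org.scalatest.concurrent.Eventually._

        // The app status store keeps one RDDStorageInfo per persisted RDD.
        // It is updated from listener events, so retry until it converges.
        eventually {
          val rddInfo = sc.statusStore.rdd(cachedData.id)
          // 10 partitions were cached by the count() above.
          assert(rddInfo.numCachedPartitions === 10)
        }

    This is a sketch, not the exact change the PR would need; the default `eventually` patience may have to be lengthened for slow CI runs.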


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org