Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/04/30 15:05:13 UTC

[GitHub] [spark] William1104 commented on a change in pull request #24221: [SPARK-27248][SQL] refresh table should recreate cache with same cache name and storage level

URL: https://github.com/apache/spark/pull/24221#discussion_r279797855
 
 

 ##########
 File path: sql/core/src/test/scala/org/apache/spark/sql/QueryTest.scala
 ##########
 @@ -205,6 +206,21 @@ abstract class QueryTest extends PlanTest {
         planWithCaching)
   }
 
+  /**
+   * Asserts that a given [[Dataset]] will be executed using the given named cache.
+   */
+  def assertCached(query: Dataset[_], cachedName: String, storageLevel: StorageLevel): Unit = {
+    val planWithCaching = query.queryExecution.withCachedData
+    val matched = planWithCaching.find( cached =>
 
 Review comment:
   Hi @srowen , 
   
  We cannot use `exists { cached =>` here, because `planWithCaching` is not a `Seq` but a `LogicalPlan`. It seems we can only traverse it with methods like `collect` or `collectFirst`.
   
   I updated the line to 
   ```
   val matched = planWithCaching.collectFirst {
     case cached: InMemoryRelation
         if cached.cacheBuilder.tableName.contains(cachedName) &&
           cached.cacheBuilder.storageLevel == storageLevel => cached
   }.nonEmpty
   ```
   Putting the predicate in a pattern guard avoids two problems with my earlier version: `tableName.get` could throw on an unnamed cache (`contains` is safe), and evaluating the predicate in the body would make `.nonEmpty` return true for any `InMemoryRelation`, even one that fails the check. The `asInstanceOf[InMemoryRelation]` was also redundant after the typed pattern match.
   I hope it looks better. 
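   As an aside, here is a minimal sketch in plain Scala collections (not Spark's `LogicalPlan`; the `Cached`/`Plain` node types are made up for illustration) of why the pattern guard matters with `collectFirst`:
   ```scala
   object CollectFirstSketch {
     sealed trait Node
     case class Cached(tableName: Option[String]) extends Node
     case class Plain(id: Int) extends Node

     def main(args: Array[String]): Unit = {
       val plan: Seq[Node] = Seq(Plain(1), Cached(Some("t1")), Plain(2))

       // Buggy: the partial function matches ANY Cached node, so collectFirst
       // returns Some(false) and .nonEmpty is true even though the name differs.
       val buggy = plan.collectFirst {
         case c: Cached => c.tableName.contains("other")
       }.nonEmpty

       // Fixed: the predicate lives in the pattern guard, so only a node that
       // actually satisfies it is collected.
       val fixed = plan.collectFirst {
         case c: Cached if c.tableName.contains("other") => c
       }.nonEmpty

       println(buggy) // true  (false positive)
       println(fixed) // false
     }
   }
   ```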
   
   Thanks and regards,
   William
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services
