Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/01/08 12:40:00 UTC

[jira] [Resolved] (SPARK-22988) Why does dataset's unpersist clear all the caches that have the same logical plan?

     [ https://issues.apache.org/jira/browse/SPARK-22988?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-22988.
----------------------------------
    Resolution: Invalid

Usually, questions are encouraged to go to the mailing list. Let's ask this question there - https://spark.apache.org/community.html. I believe you would get a better answer there.

> Why does dataset's unpersist clear all the caches that have the same logical plan?
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-22988
>                 URL: https://issues.apache.org/jira/browse/SPARK-22988
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Core
>    Affects Versions: 2.1.1
>            Reporter: Wang Cheng
>            Priority: Minor
>
> When I do the following:
> dataset A = some dataset
> A.persist
> dataset B = A.doSomething
> dataset C = A.doSomething
> C.persist
> A.unpersist
> I found that C's cache is removed too, because of the following code:
> def uncacheQuery(spark: SparkSession, plan: LogicalPlan, blocking: Boolean): Unit = writeLock {
>     val it = cachedData.iterator()
>     while (it.hasNext) {
>       val cd = it.next()
>       if (cd.plan.find(_.sameResult(plan)).isDefined) {
>         cd.cachedRepresentation.cachedColumnBuffers.unpersist(blocking)
>         it.remove()
>       }
>     }
>   }
> It removes every data cache whose plan contains the same logical plan. Should it only remove the cache of the dataset on which unpersist was called?
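
For reference, here is a minimal runnable sketch of the scenario above (a hedged illustration, not the reporter's actual code). It assumes Spark 2.1.x with a local SparkSession, and the filter transformations stand in for the hypothetical doSomething calls:

    import org.apache.spark.sql.SparkSession

    object UnpersistCascade {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[*]")
          .appName("unpersist-cascade")
          .getOrCreate()

        val a = spark.range(0, 1000).toDF("id")   // dataset A = some dataset
        a.persist()

        val b = a.filter("id % 2 = 0")            // dataset B = A.doSomething
        val c = a.filter("id % 3 = 0")            // dataset C = A.doSomething
        c.persist()

        a.unpersist()

        // C's analyzed plan contains A's plan as a subtree, so
        // find(_.sameResult(plan)) in uncacheQuery matches it and C's
        // cache entry is dropped as well: on 2.1.x this prints NONE
        // instead of the MEMORY_AND_DISK level set by c.persist().
        println(c.storageLevel)

        spark.stop()
      }
    }

Later Spark releases revisited this cascading behavior, so the output may differ on newer versions.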



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org