Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/02/01 13:31:21 UTC
[GitHub] [spark] LantaoJin edited a comment on issue #27185:
[SPARK-30494][SQL] Avoid duplicated cached RDD when replace an existing view
URL: https://github.com/apache/spark/pull/27185#issuecomment-581030414
> I think it makes sense, but we should follow how similar things are done in DROP TABLE.
I think this is similar to what DROP TABLE does.
In DropTableCommand:
https://github.com/apache/spark/blob/da32d1e6b5cc409f408384576002ccf63a83e9a1/sql/core/src/main/scala/org/apache/spark/sql/execution/command/ddl.scala#L239
```scala
try {
  sparkSession.sharedState.cacheManager.uncacheQuery(
    sparkSession.table(tableName), cascade = !isTempView)
} catch {
  case NonFatal(e) => log.warn(e.toString, e)
}
```
I added a `sparkSession.catalog.uncacheTable()` call in views.scala; `uncacheTable()` has similar logic:
https://github.com/apache/spark/blob/69ab94ff24f471783e29cc7853c0eee25ea2d88c/sql/core/src/main/scala/org/apache/spark/sql/execution/command/views.scala#L114
```scala
override def uncacheTable(tableName: String): Unit = {
  val tableIdent = sparkSession.sessionState.sqlParser.parseTableIdentifier(tableName)
  val cascade = !sessionCatalog.isTemporaryTable(tableIdent)
  sparkSession.sharedState.cacheManager.uncacheQuery(sparkSession.table(tableName), cascade)
}
```
Can I just use `try..catch` to wrap the `uncacheTable()` call?
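For what it's worth, here is a minimal self-contained sketch of that `try..catch` / `NonFatal` pattern, mirroring the error handling in `DropTableCommand`. The `uncacheTable` stub and the `safeUncache` name are hypothetical stand-ins for illustration, not the real Spark API:

```scala
import scala.util.control.NonFatal

object UncacheSketch {
  // Hypothetical stand-in for sparkSession.catalog.uncacheTable();
  // the real call can throw if the table/view no longer resolves.
  def uncacheTable(name: String): Unit =
    if (name.isEmpty) throw new IllegalArgumentException("no such table")

  // Mirrors DropTableCommand: swallow non-fatal errors, only log them,
  // so a failed uncache does not fail the whole command.
  def safeUncache(name: String): Unit =
    try {
      uncacheTable(name)
    } catch {
      case NonFatal(e) => println(s"warn: $e")
    }
}
```

With this shape, `safeUncache("v1")` succeeds silently, while `safeUncache("")` throws inside the stub and the exception is caught and logged instead of propagating, which is the behavior `DropTableCommand` relies on.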
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org