Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/02/27 08:30:10 UTC

[GitHub] dilipbiswal commented on a change in pull request #23905: [SPARK-24669][SQL] Refresh table before drop database cascade

URL: https://github.com/apache/spark/pull/23905#discussion_r260636258
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/execution/command/ddl.scala
 ##########
 @@ -102,6 +102,12 @@ case class DropDatabaseCommand(
   extends RunnableCommand {
 
   override def run(sparkSession: SparkSession): Seq[Row] = {
+    val catalog = sparkSession.sessionState.catalog
+    if (cascade) {
+      catalog.listTables(databaseName).foreach { t =>
+        catalog.refreshTable(t)
+      }
+    }
     sparkSession.sessionState.catalog.dropDatabase(databaseName, ifExists, cascade)
 
 Review comment:
   On this call we can get an error. For example, it's not allowed to drop the default database. In that case, even though the drop database action will not go ahead, we will still end up refreshing all the tables inside it. Is that expected?
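The ordering concern can be illustrated with a self-contained sketch. The stub below is hypothetical (it is not Spark's `SessionCatalog`; the names `StubCatalog`, `dropWithRefreshFirst`, and the table names are invented for illustration), but it mimics the PR's refresh-then-drop sequence: when `dropDatabase` rejects the call, the tables have already been refreshed.

```scala
import scala.collection.mutable

// Hypothetical stub catalog, used only for illustration; NOT Spark's
// SessionCatalog API. It provides just enough surface to mimic
// listTables / refreshTable / dropDatabase.
class StubCatalog {
  val databases = mutable.Map(
    "default" -> mutable.Set("t1"),
    "db1"     -> mutable.Set("a", "b")
  )
  // Records every table that gets refreshed, in order.
  val refreshed = mutable.ListBuffer.empty[String]

  def listTables(db: String): Seq[String] = databases(db).toSeq.sorted
  def refreshTable(table: String): Unit = refreshed += table

  def dropDatabase(db: String, cascade: Boolean): Unit = {
    // Spark rejects dropping the default database; the stub mimics that rule.
    require(db != "default", "Can not drop default database")
    databases.remove(db)
  }
}

object DropDemo {
  // Same ordering as the PR's DropDatabaseCommand.run: refresh every
  // table in the database first, then attempt the drop.
  def dropWithRefreshFirst(catalog: StubCatalog, db: String): Unit = {
    catalog.listTables(db).foreach(catalog.refreshTable)
    catalog.dropDatabase(db, cascade = true)
  }

  def main(args: Array[String]): Unit = {
    val catalog = new StubCatalog
    try dropWithRefreshFirst(catalog, "default")
    catch { case _: IllegalArgumentException => () }
    // The drop of "default" was rejected, yet t1 was already refreshed --
    // which is exactly the reviewer's point.
    println(catalog.refreshed.mkString(","))
  }
}
```

One way to address the comment would be to run the same validity checks (database exists, is not the default database) before the refresh loop, so a rejected drop does not refresh anything.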
