Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/03/23 04:11:32 UTC

[GitHub] [spark] imback82 commented on a change in pull request #31933: [SPARK-34701][SQL] Remove analyzing temp view again in CreateViewCommand

imback82 commented on a change in pull request #31933:
URL: https://github.com/apache/spark/pull/31933#discussion_r599256804



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/command/views.scala
##########
@@ -62,15 +62,17 @@ case class CreateViewCommand(
     comment: Option[String],
     properties: Map[String, String],
     originalText: Option[String],
-    child: LogicalPlan,
+    analyzedPlan: LogicalPlan,
     allowExisting: Boolean,
     replace: Boolean,
     viewType: ViewType)
   extends RunnableCommand {
 
   import ViewHelper._
 
-  override def innerChildren: Seq[QueryPlan[_]] = Seq(child)
+  override def plansToCheckAnalysis: Seq[LogicalPlan] = Seq(analyzedPlan)

Review comment:
       We need to run checkAnalysis on the analyzed plan; otherwise, for the following:
   ```
   sql("CREATE TABLE view_base_table (key int, data varchar(20)) USING PARQUET")
   sql("CREATE VIEW key_dependent_view AS SELECT * FROM view_base_table GROUP BY key")
   ```
   view creation succeeds, whereas it should fail with:
   ```
   org.apache.spark.sql.AnalysisException
   expression 'spark_catalog.default.view_base_table.data' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.
   ```
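   To illustrate the idea outside of Spark's internals, here is a minimal standalone Scala sketch. The types below are simplified, hypothetical stand-ins (only the names `analyzedPlan` and `plansToCheckAnalysis` mirror the PR); this is not Spark's actual `LogicalPlan`/`CheckAnalysis` code, just a sketch of why the checker has to recurse into the plan stored by the command:
   ```
   // Simplified stand-ins, not Spark's real classes.
   sealed trait Plan

   // Models an aggregate whose select list may reference columns that are
   // neither grouped nor aggregated (the error in the example above).
   case class Aggregate(groupBy: Set[String], selected: Set[String]) extends Plan

   // Models the command after this PR: it stores an already-analyzed plan and
   // hands it back to the checker via plansToCheckAnalysis.
   case class CreateViewCommand(analyzedPlan: Plan) extends Plan {
     def plansToCheckAnalysis: Seq[Plan] = Seq(analyzedPlan)
   }

   object CheckAnalysis {
     def check(plan: Plan): Unit = plan match {
       case c: CreateViewCommand =>
         // Without recursing into plansToCheckAnalysis, the invalid Aggregate
         // below is never inspected and view creation silently succeeds.
         c.plansToCheckAnalysis.foreach(check)
       case Aggregate(groupBy, selected) =>
         val notGrouped = selected.diff(groupBy)
         if (notGrouped.nonEmpty) {
           throw new IllegalArgumentException(
             s"expression(s) ${notGrouped.mkString(", ")} neither grouped nor aggregated")
         }
       case _ => ()
     }
   }

   object Demo extends App {
     // SELECT key, data ... GROUP BY key  -> must fail because 'data' is not grouped.
     val badView = CreateViewCommand(Aggregate(groupBy = Set("key"), selected = Set("key", "data")))
     CheckAnalysis.check(badView)  // throws IllegalArgumentException
   }
   ```
   In the real change, keeping the stored plan reachable through `plansToCheckAnalysis` (rather than only `innerChildren`) appears to be what lets the analysis checks still run on the view's query even though `CreateViewCommand` holds an already-analyzed plan.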




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org