Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/08/26 11:15:57 UTC

[GitHub] [spark] hemanthmeka commented on issue #25570: [SPARK-23519][SQL] create view should work from query with duplicate output columns

hemanthmeka commented on issue #25570: [SPARK-23519][SQL] create view should work from query with duplicate output columns
URL: https://github.com/apache/spark/pull/25570#issuecomment-524820928
 
 
   > > ALTER VIEW AS gets its schema from the new SELECT query provided, so it is not affected
   > 
   > But `sql("ALTER VIEW v23519 AS SELECT c1, c1 FROM t23519")` also fails with an exception?
   > 
   > ```scala
   > org.apache.spark.sql.AnalysisException: Found duplicate column(s) in the view definition: `c1`;
   > [info]   at org.apache.spark.sql.util.SchemaUtils$.checkColumnNameDuplication(SchemaUtils.scala:90)                                                   
   > [info]   at org.apache.spark.sql.util.SchemaUtils$.checkColumnNameDuplication(SchemaUtils.scala:70)                                                   
   > [info]   at org.apache.spark.sql.execution.command.ViewHelper$.generateViewProperties(views.scala:369)
   > [info]   at org.apache.spark.sql.execution.command.AlterViewAsCommand.alterPermanentView(views.scala:301)
   > ```
   
   From the comment in `CreateViewCommand.run` ('**Nothing we need to retain from the old view, so ...**') I get the impression that ALTER VIEW always creates a new view definition whose schema is taken from the query, so `SELECT c1, c1` should throw an `AnalysisException`. Which is the right behaviour here? Is there any reference?
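   
   For context, a minimal reproduction sketch of the behaviour under discussion, assuming a `spark-shell` session (the default `spark` SparkSession) and reusing the table/view names from the quoted example; the aliased variant at the end is only an illustration of how the duplicate-column check can be sidestepped:
   
   ```scala
   // Set-up matching the names used in the quoted example above.
   spark.sql("CREATE TABLE t23519 (c1 INT) USING parquet")
   spark.sql("CREATE VIEW v23519 AS SELECT c1 FROM t23519")
   
   // Currently fails in ViewHelper.generateViewProperties with:
   // org.apache.spark.sql.AnalysisException: Found duplicate column(s) in the view definition: `c1`
   spark.sql("ALTER VIEW v23519 AS SELECT c1, c1 FROM t23519")
   
   // Aliasing the duplicated column (c1_dup is just an illustrative alias) keeps the
   // view schema free of duplicate names, so checkColumnNameDuplication passes.
   spark.sql("ALTER VIEW v23519 AS SELECT c1, c1 AS c1_dup FROM t23519")
   ```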
   
   
