Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2018/01/25 01:11:00 UTC

[jira] [Commented] (SPARK-23201) Cannot create view when duplicate columns exist in subquery

    [ https://issues.apache.org/jira/browse/SPARK-23201?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16338525#comment-16338525 ] 

Dongjoon Hyun commented on SPARK-23201:
---------------------------------------

Hi, [~joha0123].

It seems to work in the latest Apache Spark (2.2.1). Do you really want to report this against *1.6.0*?

{code}
scala> sql("create view v1 as select tmp.colA, tmp.col2, tmp.colB, tmp.col5 from (select * from A left join B on (A.colA = B.colB)) tmp").show

scala> sql("select * from v1").show
+----+----+----+----+
|colA|col2|colB|col5|
+----+----+----+----+
+----+----+----+----+

scala> spark.version
res7: String = 2.2.1
{code}
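In case the issue still reproduces on the 1.6.x line, the usual workaround is to project explicit columns inside the subquery so the duplicated col3 never reaches the view definition. A minimal sketch of that pattern, illustrated with SQLite's in-memory engine rather than Spark so it is self-contained (the table and column names are taken from the report; the sample rows are made up):

```python
import sqlite3

# Recreate the reporter's schema: A(colA, col2, col3), B(colB, col3, col5).
# Workaround sketch: instead of "select *" in the subquery, project only the
# columns the view needs, so the ambiguous col3 is never part of the output.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE A (colA INTEGER, col2 TEXT, col3 TEXT);
CREATE TABLE B (colB INTEGER, col3 TEXT, col5 TEXT);
CREATE VIEW testview AS
SELECT tmp.colA, tmp.col2, tmp.colB, tmp.col5
FROM (
  SELECT A.colA, A.col2, B.colB, B.col5
  FROM A LEFT JOIN B ON (A.colA = B.colB)
) tmp;
""")
conn.execute("INSERT INTO A VALUES (1, 'x', 'p')")
conn.execute("INSERT INTO B VALUES (1, 'q', 'y')")
row = conn.execute("SELECT * FROM testview").fetchone()
print(row)  # (1, 'x', 1, 'y')
```

The same projection works inside the Spark SQL statement: listing `A.colA, A.col2, B.colB, B.col5` in the inner select avoids the duplicate-column check entirely.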

> Cannot create view when duplicate columns exist in subquery
> -----------------------------------------------------------
>
>                 Key: SPARK-23201
>                 URL: https://issues.apache.org/jira/browse/SPARK-23201
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Johannes Mayer
>            Priority: Critical
>
> I have two tables, A(colA, col2, col3) and B(colB, col3, col5).
> If I join them in a subquery on A.colA = B.colB, I can select the non-duplicate columns, but I cannot create a view (col3 is a duplicate, even though it is not selected):
>
> {code:java}
> create view testview as select
> tmp.colA, tmp.col2, tmp.colB, tmp.col5
> from (
> select * from A left join B
> on (A.colA = B.colB)
> ) tmp
> {code}
>
> This works:
>
> {code:java}
> select
> tmp.colA, tmp.col2, tmp.colB, tmp.col5
> from (
> select * from A left join B
> on (A.colA = B.colB)
> ) tmp
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org