Posted to issues@spark.apache.org by "Yuming Wang (JIRA)" <ji...@apache.org> on 2019/07/10 14:25:00 UTC

[jira] [Commented] (SPARK-28327) Spark SQL can't support union when the left query has a queryOrganization

    [ https://issues.apache.org/jira/browse/SPARK-28327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16882109#comment-16882109 ] 

Yuming Wang commented on SPARK-28327:
-------------------------------------

PostgreSQL also does not support this:
{code:sql}
postgres=# create or replace temporary view t1 as select * from (values(1), (2), (null), (3), (null)) as v (val);
CREATE VIEW
postgres=# SELECT val FROM t1 LIMIT 1 union all SELECT val FROM t1 LIMIT 2;
ERROR:  syntax error at or near "union"
LINE 1: SELECT val FROM t1 LIMIT 1 union all SELECT val FROM t1 LIMI...
                                   ^
postgres=# (SELECT val FROM t1 LIMIT 1) union all (SELECT val FROM t1 LIMIT 2);
 val
-----
   1
   1
   2
(3 rows)
{code}


Could you add parentheses?
{code:sql}
spark-sql> create or replace temporary view t1 as select * from (values(1), (2), (null), (3), (null)) as v (val);
spark-sql> (SELECT val FROM t1 LIMIT 1) union all (SELECT val FROM t1 LIMIT 2);
1
1
2
{code}
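
Instead of parentheses, each limited query can also be wrapped in a derived table, which scopes the LIMIT and sidesteps the grammar ambiguity. A minimal sketch of the same idea, using SQLite purely as a stand-in engine (Spark SQL accepts the equivalent subquery form, with an alias on each derived table):
{code:python}
# Sketch: SQLite stands in for any SQL engine here; the t1 data mirrors
# the temporary view from the examples above.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t1 (val INTEGER)")
conn.executemany("INSERT INTO t1 VALUES (?)",
                 [(1,), (2,), (None,), (3,), (None,)])

# Each LIMIT is scoped inside its own derived table, equivalent to the
# parenthesized form: (SELECT ... LIMIT 1) UNION ALL (SELECT ... LIMIT 2).
rows = conn.execute("""
    SELECT val FROM (SELECT val FROM t1 LIMIT 1)
    UNION ALL
    SELECT val FROM (SELECT val FROM t1 LIMIT 2)
""").fetchall()
print(sorted(rows))
{code}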


> Spark SQL can't support union when the left query has a queryOrganization
> -------------------------------------------------------------------------
>
>                 Key: SPARK-28327
>                 URL: https://issues.apache.org/jira/browse/SPARK-28327
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.0, 3.0.0
>            Reporter: angerszhu
>            Priority: Major
>
> Spark SQL can't support SQL like
> {code:sql}
> SELECT A FROM TABLE_1 LIMIT 1
> UNION 
> SELECT A FROM TABLE_2 LIMIT 2{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org