Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2016/01/11 02:11:39 UTC

[jira] [Updated] (SPARK-12745) Limit is not supported inside Set Operation

     [ https://issues.apache.org/jira/browse/SPARK-12745?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li updated SPARK-12745:
----------------------------
    Description: 
The current SQLContext allows the following query, which is copied from a test case in SQLQuerySuite:
{code}
     checkAnswer(sql(
       """
         |select key from ((select * from testData limit 1)
         |  union all (select * from testData limit 1)) x limit 1
       """.stripMargin),
       Row(1)
     )
{code}

However, it is rejected by the Hive parser, since LIMIT is not supported inside a set operation.
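
For reference, a minimal sketch (not part of the original test case) of the same query shape issued directly against both parsers; it assumes a sqlContext and a hiveContext are in scope and that testData with its key column is registered as in SQLQuerySuite:
{code}
// Minimal sketch, assuming `sqlContext: SQLContext` and `hiveContext: HiveContext`
// are available and `testData` is registered as in the SQLQuerySuite fixtures.
val query =
  """
    |select key from ((select * from testData limit 1)
    |  union all (select * from testData limit 1)) x limit 1
  """.stripMargin

sqlContext.sql(query).collect()   // accepted by the native SQL parser
// hiveContext.sql(query)         // expected to fail to parse: LIMIT inside the
//                                // parenthesized UNION ALL branches is rejected
{code}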


> Limit is not supported inside Set Operation
> -------------------------------------------
>
>                 Key: SPARK-12745
>                 URL: https://issues.apache.org/jira/browse/SPARK-12745
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Xiao Li
>
> The current SQLContext allows the following query, which is copied from a test case in SQLQuerySuite:
> {code}
>      checkAnswer(sql(
>        """
>          |select key from ((select * from testData limit 1)
>          |  union all (select * from testData limit 1)) x limit 1
>        """.stripMargin),
>        Row(1)
>      )
> {code}
> However, it is rejected by the Hive parser.


