Posted to issues@hive.apache.org by "slim bouguerra (JIRA)" <ji...@apache.org> on 2017/02/01 23:31:51 UTC

[jira] [Commented] (HIVE-15632) Hive/Druid integration: Incorrect result - Limit on timestamp disappears

    [ https://issues.apache.org/jira/browse/HIVE-15632?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15849115#comment-15849115 ] 

slim bouguerra commented on HIVE-15632:
---------------------------------------

There is no limit on the Druid timeseries query type.
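For illustration only, here is a rough sketch of how the same {{ORDER BY ... LIMIT 10}} could be carried inside Druid itself, e.g. as a _groupBy_ query with a {{limitSpec}} instead of a _timeseries_ query. This is not the patch for this issue; the {{timeFormat}} extraction over {{__time}} and the exact field values are assumptions modeled on the generated query quoted below.

{code:json}
{
  "queryType": "groupBy",
  "dataSource": "druid_tpcds_ss_sold_time_subset",
  "granularity": "all",
  "dimensions": [
    {
      "type": "extraction",
      "dimension": "__time",
      "outputName": "__time",
      "extractionFn": { "type": "timeFormat", "format": "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'", "timeZone": "UTC" }
    }
  ],
  "aggregations": [],
  "limitSpec": {
    "type": "default",
    "limit": 10,
    "columns": [ { "dimension": "__time", "direction": "ascending" } ]
  },
  "intervals": [ "1900-01-01T00:00:00.000Z/3000-01-01T00:00:00.000Z" ]
}
{code}

Whether the planner should rewrite to _groupBy_ in this case or instead keep the limit on the Hive side (e.g. in the Fetch/Limit operator) is a separate design choice.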

> Hive/Druid integration: Incorrect result - Limit on timestamp disappears
> ------------------------------------------------------------------------
>
>                 Key: HIVE-15632
>                 URL: https://issues.apache.org/jira/browse/HIVE-15632
>             Project: Hive
>          Issue Type: Bug
>          Components: Druid integration
>    Affects Versions: 2.2.0
>            Reporter: Jesus Camacho Rodriguez
>            Assignee: Jesus Camacho Rodriguez
>            Priority: Critical
>
> This can be observed with the following query:
> {code:sql}
> SELECT DISTINCT `__time`
> FROM store_sales_sold_time_subset_hive
> ORDER BY `__time` ASC
> LIMIT 10;
> {code}
> The query is translated correctly to a Druid _timeseries_ query, but the _limit_ operator disappears.
> {code}
> OK
> Plan optimized by CBO.
> Stage-0
>   Fetch Operator
>     limit:-1
>     Select Operator [SEL_1]
>       Output:["_col0"]
>       TableScan [TS_0]
>         Output:["__time"],properties:{"druid.query.json":"{\"queryType\":\"timeseries\",\"dataSource\":\"druid_tpcds_ss_sold_time_subset\",\"descending\":false,\"granularity\":\"NONE\",\"aggregations\":[],\"intervals\":[\"1900-01-01T00:00:00.000Z/3000-01-01T00:00:00.000Z\"]}","druid.query.type":"timeseries"}
> {code}
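> For readability, the same {{druid.query.json}} pretty-printed; note that no limit (or {{limitSpec}}) field appears anywhere in the generated query:
> {code:json}
> {
>   "queryType": "timeseries",
>   "dataSource": "druid_tpcds_ss_sold_time_subset",
>   "descending": false,
>   "granularity": "NONE",
>   "aggregations": [],
>   "intervals": [ "1900-01-01T00:00:00.000Z/3000-01-01T00:00:00.000Z" ]
> }
> {code}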
> Thus, the result has more than 10 rows.


