Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/06/26 16:42:00 UTC

[jira] [Commented] (SPARK-21212) Can't use Count(*) with Order Clause

    [ https://issues.apache.org/jira/browse/SPARK-21212?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16063376#comment-16063376 ] 

Sean Owen commented on SPARK-21212:
-----------------------------------

As the error message says, 'value' isn't defined anywhere in the query's output: after the aggregation the only available column is count(1), so the ORDER BY can't resolve it.
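
For what it's worth, a minimal sketch of two rewrites that avoid the error, reusing the table and column names from the report (treat them as placeholders):

{code}
-- COUNT(*) without GROUP BY returns a single row, so the ORDER BY
-- can simply be dropped:
select count(*)
from table
where value between 1498240079000 and cast(now() as bigint)*1000;

-- If per-value counts were intended, grouping by the column keeps
-- `value` in scope for the ORDER BY:
select value, count(*)
from table
where value between 1498240079000 and cast(now() as bigint)*1000
group by value
order by value;
{code}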

> Can't use Count(*) with Order Clause
> ------------------------------------
>
>                 Key: SPARK-21212
>                 URL: https://issues.apache.org/jira/browse/SPARK-21212
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.1.0
>         Environment: Windows; external data provided through data source api
>            Reporter: Shawn Lavelle
>            Priority: Minor
>
> I don't think this should fail the query:
> {code}
> jdbc:hive2://user:port/> select count(*) from table where value between 1498240079000 and cast(now() as bigint)*1000 order by value;
> {code}
> {code}
> Error: org.apache.spark.sql.AnalysisException: cannot resolve '`value`' given input columns: [count(1)]; line 1 pos 113;
> 'Sort ['value ASC NULLS FIRST], true
> +- Aggregate [count(1) AS count(1)#718L]
>    +- Filter ((value#413L >= 1498240079000) && (value#413L <= (cast(current_timestamp() as bigint) * cast(1000 as bigint))))
>       +- SubqueryAlias table
>          +- Relation[field1#411L,field2#412,value#413L,field3#414,field4#415,field5#416,field6#417,field7#418,field8#419,field9#420] com.redacted@16004579 (state=,code=0)
> {code}
> Arguably, the optimizer could ignore the "order by" clause, but I leave that to more informed minds than my own.



