Posted to issues@spark.apache.org by "Marco Gaido (JIRA)" <ji...@apache.org> on 2018/12/05 11:39:00 UTC

[jira] [Commented] (SPARK-26270) Having clause does not work with explode anymore

    [ https://issues.apache.org/jira/browse/SPARK-26270?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16709953#comment-16709953 ] 

Marco Gaido commented on SPARK-26270:
-------------------------------------

This is caused by SPARK-25708; you can find more details on that ticket. If you want to switch back to the behavior Spark had for this case before 2.4.0, you can set {{spark.sql.legacy.parser.havingWithoutGroupByAsWhere}} to {{true}}. In any case, this query does not work in Postgres either, so I don't think it should be "fixed".
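
For illustration, something along these lines should restore the pre-2.4 behavior in a Spark SQL session (a minimal sketch, not verified here):

{code:sql}
-- Re-enable the pre-2.4 handling of HAVING without GROUP BY (treated like WHERE)
SET spark.sql.legacy.parser.havingWithoutGroupByAsWhere=true;

-- The query from the report should then parse again
SELECT explode(col1) AS v FROM VALUES array(1,2) HAVING v > 1;
{code}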

Since there is already a config that fits your needs, I am closing this ticket. Please feel free to reopen it if you think further action is required. Thanks.
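
As a side note on the alternate construct asked about in the description: moving the generator into a subquery and filtering with a regular {{WHERE}} should work without the legacy flag (an untested sketch; the alias {{t}} is only illustrative):

{code:sql}
-- Explode inside a derived table, then filter the generated column with WHERE
SELECT v
FROM (SELECT explode(col1) AS v FROM VALUES array(1,2)) t
WHERE v > 1;
{code}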

> Having clause does not work with explode anymore
> ------------------------------------------------
>
>                 Key: SPARK-26270
>                 URL: https://issues.apache.org/jira/browse/SPARK-26270
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Olli Kuonanoja
>            Priority: Major
>
> Hi,
> In Spark 2.3.0 it was possible to execute queries like
> {code:sql}
> select explode(col1) as v from values array(1,2) having v>1
> {code}
> but in 2.4.0 it leads to 
> {noformat}
> org.apache.spark.sql.AnalysisException: Generators are not supported outside the SELECT clause, but got: 'Aggregate [explode(col1#1) AS v#0];
> {noformat}
> Before looking into a fix, I'm trying to understand whether this was changed on purpose and whether an alternate construct is available. I could not find any pre-existing tests for the explode-having combination.


