Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/09/13 09:06:20 UTC
[jira] [Commented] (SPARK-17114) Adding a 'GROUP BY 1' where first column is literal results in wrong answer
[ https://issues.apache.org/jira/browse/SPARK-17114?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15486727#comment-15486727 ]
Apache Spark commented on SPARK-17114:
--------------------------------------
User 'hvanhovell' has created a pull request for this issue:
https://github.com/apache/spark/pull/15076
> Adding a 'GROUP BY 1' where first column is literal results in wrong answer
> ---------------------------------------------------------------------------
>
> Key: SPARK-17114
> URL: https://issues.apache.org/jira/browse/SPARK-17114
> Project: Spark
> Issue Type: Bug
> Affects Versions: 1.6.2, 2.0.0
> Reporter: Josh Rosen
> Labels: correctness
>
> Consider the following example:
> {code}
> sc.parallelize(Seq(128, 256)).toDF("int_col").registerTempTable("mytable")
> // The following query should return an empty result set because the `IN` filter condition is always false for this single-row table.
> val withoutGroupBy = sqlContext.sql("""
> SELECT 'foo'
> FROM mytable
> WHERE int_col == 0
> """)
> assert(withoutGroupBy.collect().isEmpty, "original query returned wrong answer")
> // After adding a 'GROUP BY 1' the query result should still be empty because we'd be grouping an empty table:
> val withGroupBy = sqlContext.sql("""
> SELECT 'foo'
> FROM mytable
> WHERE int_col == 0
> GROUP BY 1
> """)
> assert(withGroupBy.collect().isEmpty, "adding GROUP BY resulted in wrong answer")
> {code}
> This query fails the second assertion by returning a single row. It appears that using {{GROUP BY 1}} where column 1 is a constant causes the filter condition to be ignored.
> Both PostgreSQL and SQLite return empty result sets for the query containing the {{GROUP BY}}.
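The expected semantics can be illustrated without Spark at all: grouping an empty input produces zero groups, so adding a GROUP BY to a query whose WHERE clause filters out every row cannot introduce output rows. A minimal plain-Scala sketch of this (illustrative only; object and value names are invented for the example and this is not the Spark code path):

```scala
// Plain-Scala model of the expected GROUP BY semantics (illustrative only):
// grouping an empty sequence yields zero groups, hence zero output rows.
object GroupByEmptyInput {
  def main(args: Array[String]): Unit = {
    val table = Seq(128, 256)               // mytable(int_col)
    val filtered = table.filter(_ == 0)     // WHERE int_col == 0 -> empty
    // Grouping by a constant key: an empty input still produces an empty Map,
    // so SELECT 'foo' ... GROUP BY 1 should return no rows.
    val grouped = filtered.groupBy(_ => "foo")
    assert(grouped.isEmpty, "grouping an empty input must yield no groups")
    println(grouped.size)                   // prints 0
  }
}
```

This mirrors why PostgreSQL and SQLite return empty result sets: with an explicit GROUP BY there is no implicit "global" group, unlike a plain aggregate over an empty input.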
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org