Posted to reviews@spark.apache.org by "wangyum (via GitHub)" <gi...@apache.org> on 2023/03/09 11:35:29 UTC
[GitHub] [spark] wangyum commented on a diff in pull request #39691: [SPARK-31561][SQL] Add QUALIFY clause
wangyum commented on code in PR #39691:
URL: https://github.com/apache/spark/pull/39691#discussion_r1130865469
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -1729,6 +1729,23 @@ class Analyzer(override val catalogManager: CatalogManager)
resolveExpressionByPlanChildren(resolvedWithAgg, u, allowOuter = true)
}
+ case u @ UnresolvedQualify(cond, child) if !u.resolved && child.resolved =>
+ if (!u.containsPattern(WINDOW_EXPRESSION)) {
Review Comment:
```sql
CREATE TABLE dealer (id INT, city STRING, car_model STRING, quantity INT) USING parquet;
SELECT * FROM (SELECT *, ROW_NUMBER() OVER(PARTITION BY city ORDER BY id) AS rn FROM dealer QUALIFY id > 0) t QUALIFY t.id > 0;
```
The analyzed logical plan:
```
== Analyzed Logical Plan ==
id: int, city: string, car_model: string, quantity: int, rn: int
Filter (id#2 > 0)
+- Project [id#2, city#3, car_model#4, quantity#5, rn#0]
+- SubqueryAlias t
+- Filter (id#2 > 0)
+- Project [id#2, city#3, car_model#4, quantity#5, rn#0]
+- Project [id#2, city#3, car_model#4, quantity#5, rn#0, rn#0]
+- Window [row_number() windowspecdefinition(city#3, id#2 ASC NULLS FIRST, specifiedwindowframe(RowFrame, unboundedpreceding$(), currentrow$())) AS rn#0], [city#3], [id#2 ASC NULLS FIRST]
+- Project [id#2, city#3, car_model#4, quantity#5]
+- SubqueryAlias spark_catalog.default.dealer
+- Relation spark_catalog.default.dealer[id#2,city#3,car_model#4,quantity#5] parquet
```
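As the plan shows, QUALIFY is resolved into a Filter on top of the Window node, i.e. the predicate is applied after window functions are computed. A minimal sketch of that equivalence, using Python's bundled sqlite3 (which has window functions but no QUALIFY, so the same filtering is written as an outer query; the `dealer` schema mirrors the example above, the rows are made up):

```python
import sqlite3

# Requires SQLite >= 3.25 for window functions (bundled with modern Python).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dealer (id INT, city TEXT, car_model TEXT, quantity INT)"
)
conn.executemany(
    "INSERT INTO dealer VALUES (?, ?, ?, ?)",
    [
        (1, "Fremont", "Honda Civic", 10),
        (2, "Fremont", "Honda CRV", 7),
        (3, "Dublin", "Honda Civic", 20),
    ],
)

# Equivalent of:
#   SELECT *, ROW_NUMBER() OVER (PARTITION BY city ORDER BY id) AS rn
#   FROM dealer QUALIFY rn = 1
# expressed as a filter over the window result, matching the
# Filter-above-Window shape in the analyzed plan.
rows = conn.execute(
    """
    SELECT * FROM (
        SELECT *, ROW_NUMBER() OVER (PARTITION BY city ORDER BY id) AS rn
        FROM dealer
    ) WHERE rn = 1
    """
).fetchall()
print(rows)  # first dealer per city
```

This is only an illustration of the semantics, not of Spark's analyzer internals.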
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org