Posted to commits@spark.apache.org by we...@apache.org on 2020/07/07 13:41:31 UTC
[spark] branch master updated: [SPARK-31975][SQL] Show AnalysisException when WindowFunction is used without WindowExpression
This is an automated email from the ASF dual-hosted git repository.
wenchen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 2e23da2 [SPARK-31975][SQL] Show AnalysisException when WindowFunction is used without WindowExpression
2e23da2 is described below
commit 2e23da2bda47dcfa8d143d317aff56860b77fe80
Author: ulysses <yo...@weidian.com>
AuthorDate: Tue Jul 7 13:39:04 2020 +0000
[SPARK-31975][SQL] Show AnalysisException when WindowFunction is used without WindowExpression
### What changes were proposed in this pull request?
Add WindowFunction check at `CheckAnalysis`.
### Why are the changes needed?
Provide a friendly error message.
**BEFORE**
```scala
scala> sql("select rank() from values(1)").show
java.lang.UnsupportedOperationException: Cannot generate code for expression: rank()
```
**AFTER**
```scala
scala> sql("select rank() from values(1)").show
org.apache.spark.sql.AnalysisException: Window function rank() requires an OVER clause.;;
Project [rank() AS RANK()#3]
+- LocalRelation [col1#2]
```
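The rule added to `CheckAnalysis` can be sketched in isolation. The classes below are hypothetical, heavily reduced stand-ins for Catalyst's `Expression`, `WindowFunction`, and `WindowExpression` types; only the shape of the check itself mirrors the actual diff.

```scala
// Hypothetical, simplified stand-ins for Catalyst's expression classes;
// the names mirror the real ones but the hierarchy is heavily reduced.
trait Expression { def children: Seq[Expression] }

trait WindowFunction extends Expression

case class Rank() extends WindowFunction {
  def children: Seq[Expression] = Nil
  override def toString: String = "rank()"
}

// A window function wrapped in an OVER clause is legal.
case class WindowExpression(windowFunction: WindowFunction) extends Expression {
  def children: Seq[Expression] = Seq(windowFunction)
}

// Any other parent (an alias, an arithmetic node, ...) is not.
case class Alias(child: Expression, name: String) extends Expression {
  def children: Seq[Expression] = Seq(child)
}

// Mirrors the case added to CheckAnalysis: reject any expression that has
// a WindowFunction child but is not itself a WindowExpression, then keep
// walking the tree otherwise.
def check(e: Expression): Option[String] = e match {
  case _ if e.children.exists(_.isInstanceOf[WindowFunction]) &&
      !e.isInstanceOf[WindowExpression] =>
    val w = e.children.find(_.isInstanceOf[WindowFunction]).get
    Some(s"Window function $w requires an OVER clause.")
  case _ => e.children.flatMap(check).headOption
}
```

With these stand-ins, `check(Alias(Rank(), "r"))` reports the error while `check(WindowExpression(Rank()))` passes. In spark-shell, the fix on the user side is simply to supply an OVER clause, e.g. `sql("select rank() over (order by col1) from values(1)")`.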
### Does this PR introduce _any_ user-facing change?
Yes, users will be given a better error message.
### How was this patch tested?
Pass the newly added unit test.
Closes #28808 from ulysses-you/SPARK-31975.
Authored-by: ulysses <yo...@weidian.com>
Signed-off-by: Wenchen Fan <we...@databricks.com>
---
.../apache/spark/sql/catalyst/analysis/CheckAnalysis.scala | 5 +++++
.../apache/spark/sql/catalyst/analysis/AnalysisSuite.scala | 11 +++++++++++
2 files changed, 16 insertions(+)
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala
index 9c99aca..43dd097 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala
@@ -158,6 +158,11 @@ trait CheckAnalysis extends PredicateHelper {
case g: GroupingID =>
failAnalysis("grouping_id() can only be used with GroupingSets/Cube/Rollup")
+ case e: Expression if e.children.exists(_.isInstanceOf[WindowFunction]) &&
+ !e.isInstanceOf[WindowExpression] =>
+ val w = e.children.find(_.isInstanceOf[WindowFunction]).get
+ failAnalysis(s"Window function $w requires an OVER clause.")
+
case w @ WindowExpression(AggregateExpression(_, _, true, _, _), _) =>
failAnalysis(s"Distinct window functions are not supported: $w")
diff --git a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnalysisSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnalysisSuite.scala
index c15ec49..c0be49a 100644
--- a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnalysisSuite.scala
+++ b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnalysisSuite.scala
@@ -884,4 +884,15 @@ class AnalysisSuite extends AnalysisTest with Matchers {
Seq("Intersect can only be performed on tables with the compatible column types. " +
"timestamp <> double at the second column of the second table"))
}
+
+ test("SPARK-31975: Throw user facing error when use WindowFunction directly") {
+ assertAnalysisError(testRelation2.select(RowNumber()),
+ Seq("Window function row_number() requires an OVER clause."))
+
+ assertAnalysisError(testRelation2.select(Sum(RowNumber())),
+ Seq("Window function row_number() requires an OVER clause."))
+
+ assertAnalysisError(testRelation2.select(RowNumber() + 1),
+ Seq("Window function row_number() requires an OVER clause."))
+ }
}