Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/12/21 17:05:32 UTC

[GitHub] [spark] allisonwang-db commented on a diff in pull request #39133: [SPARK-41595][SQL] Support generator function explode/explode_outer in the FROM clause

allisonwang-db commented on code in PR #39133:
URL: https://github.com/apache/spark/pull/39133#discussion_r1054612098


##########
sql/core/src/test/resources/sql-tests/inputs/table-valued-functions.sql:
##########
@@ -27,3 +27,33 @@ select * from range(0, 5, 0);
 
 -- range call with a mixed-case function name
 select * from RaNgE(2);
+
+-- explode
+select * from explode(array(1, 2));
+select * from explode(map('a', 1, 'b', 2));
+
+-- explode with empty values
+select * from explode(array());
+select * from explode(map());
+
+-- explode with column aliases
+select * from explode(array(1, 2)) t(c1);
+select * from explode(map('a', 1, 'b', 2)) t(k, v);
+
+-- explode with erroneous input

Review Comment:
   I will create another PR to support explode with LATERAL references. 
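   
   A sketch of what such a follow-up could look like (the exact syntax is an assumption here, not something added by this PR): with LATERAL support, explode could consume a column produced by an earlier item in the same FROM clause, e.g.
   
       -- hypothetical lateral-reference test cases
       select * from values (array(1, 2)), (array(3, 4)) as t(a), lateral explode(t.a);
       select * from values (map('a', 1, 'b', 2)) as t(m), lateral explode(t.m) t2(k, v);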



##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -2251,7 +2251,7 @@ class Analyzer(override val catalogManager: CatalogManager)
                 messageParameters = Map("name" -> u.name.quoted))
           }
           // If alias names assigned, add `Project` with the aliases
-          if (u.outputNames.nonEmpty) {
+          if (resolvedFunc.resolved && u.outputNames.nonEmpty) {

Review Comment:
   The input arguments of the function may have incompatible data types, in which case `resolvedFunc` stays unresolved and the alias `Project` cannot be added yet. I will update this to make it clearer.
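   
   As an illustration of the failing case (hypothetical inputs, not necessarily the ones in the new tests), a generator call whose argument has an incompatible type cannot be resolved, so the aliasing Project must not be added on top of it:
   
       -- explode expects an array or map argument, so these calls stay unresolved
       select * from explode(1) t(c1);
       select * from explode('not an array') t(c1);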



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

