Posted to commits@spark.apache.org by gu...@apache.org on 2022/08/17 08:34:20 UTC

[spark] branch master updated: [SPARK-40066][SQL][FOLLOW-UP] Check if ElementAt is resolved before getting its dataType

This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new fbc0edac585 [SPARK-40066][SQL][FOLLOW-UP] Check if ElementAt is resolved before getting its dataType
fbc0edac585 is described below

commit fbc0edac5859dae6b2c9ad012d3932f54196f2e6
Author: Hyukjin Kwon <gu...@apache.org>
AuthorDate: Wed Aug 17 17:34:02 2022 +0900

    [SPARK-40066][SQL][FOLLOW-UP] Check if ElementAt is resolved before getting its dataType
    
    ### What changes were proposed in this pull request?
    
    This PR is a follow-up of https://github.com/apache/spark/pull/37503 that adds a check that `ElementAt`'s left child is resolved before getting its `dataType`.
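    
    Calling `dataType` on an unresolved Catalyst expression throws `UnresolvedException`, and the stack trace below shows `initQueryContext` running from `SupportQueryContext.$init$`, i.e. inside `ElementAt`'s constructor, before the analyzer has resolved the children. The following standalone sketch is a hypothetical miniature of that pattern (`Expr`, `SupportsContext`, and `MiniElementAt` are illustrative names, not Spark's actual classes) showing why the `resolved` guard is needed:
    
    ```scala
    sealed trait Expr { def resolved: Boolean; def dataType: String }
    
    // Like Catalyst's SupportQueryContext, this trait initializes a val during
    // object construction, i.e. before the analyzer has run.
    trait SupportsContext { self: Expr =>
      val context: Option[String] = initContext()
      def initContext(): Option[String]
    }
    
    case object UnresolvedVar extends Expr {
      val resolved = false
      def dataType: String = throw new UnsupportedOperationException(
        "Invalid call to dataType on unresolved object")
    }
    
    case class MiniElementAt(left: Expr) extends Expr with SupportsContext {
      def resolved: Boolean = left.resolved
      def dataType: String = left.dataType
      // Mirrors the one-line fix: check left.resolved before touching dataType.
      override def initContext(): Option[String] =
        if (left.resolved && left.dataType == "array") Some("ctx") else None
    }
    
    // MiniElementAt(UnresolvedVar) now constructs fine; without the
    // left.resolved guard, initContext() would throw during construction.
    ```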
    
    ### Why are the changes needed?
    
    To make the tests pass with ANSI mode enabled. Currently the test fails (https://github.com/apache/spark/runs/7870131749?check_suite_focus=true) as below:
    
    ```
    [info] - map_filter *** FAILED *** (243 milliseconds)
    [info]  org.apache.spark.sql.catalyst.analysis.UnresolvedException: Invalid call to dataType on unresolved object
    [info]  at org.apache.spark.sql.catalyst.expressions.UnresolvedNamedLambdaVariable.dataType(higherOrderFunctions.scala:46)
    [info]  at org.apache.spark.sql.catalyst.expressions.ElementAt.initQueryContext(collectionOperations.scala:2275)
    [info]  at org.apache.spark.sql.catalyst.expressions.SupportQueryContext.$init$(Expression.scala:603)
    [info]  at org.apache.spark.sql.catalyst.expressions.ElementAt.<init>(collectionOperations.scala:2105)
    [info]  at org.apache.spark.sql.functions$.element_at(functions.scala:3958)
    [info]  at org.apache.spark.sql.DataFrameFunctionsSuite.$anonfun$new$452(DataFrameFunctionsSuite.scala:2476)
    [info]  at org.apache.spark.sql.functions$.createLambda(functions.scala:4029)
    [info]  at org.apache.spark.sql.functions$.map_filter(functions.scala:4256)
    [info]  at org.apache.spark.sql.DataFrameFunctionsSuite.$anonfun$new$451(DataFrameFunctionsSuite.scala:2476)
    [info]  at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:133)
    [info]  at org.apache.spark.sql.DataFrameFunctionsSuite.$anonfun$new$445(DataFrameFunctionsSuite.scala:2478)
    [info]  at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
    [info]  at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
    [info]  at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
    [info]  at org.scalatest.Transformer.apply(Transformer.scala:22)
    [info]  at org.scalatest.Transformer.apply(Transformer.scala:20)
    [info]  at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:190)
    [info]  at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:204)
    [info]  at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:188)
    ```
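    
    A minimal repro sketch of the failing call shape, inferred from the trace above (the actual test body in `DataFrameFunctionsSuite` may differ): constructing `element_at` inside a `map_filter` lambda means its left child is still an `UnresolvedNamedLambdaVariable` at construction time.
    
    ```scala
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._
    
    val spark = SparkSession.builder().master("local[1]")
      .config("spark.sql.ansi.enabled", "true") // failure only shows with ANSI on
      .getOrCreate()
    import spark.implicits._
    
    val df = Seq(Map(1 -> Seq(1, 2, 3))).toDF("m")
    // element_at(v, 1) is built while `v` is an unresolved lambda variable;
    // before this fix, ElementAt's constructor called left.dataType and threw.
    df.select(map_filter($"m", (k, v) => element_at(v, 1) > 1)).show()
    ```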
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, test-only.
    
    ### How was this patch tested?
    
    Manually tested with ANSI mode enabled.
    
    Closes #37548 from HyukjinKwon/SPARK-40066.
    
    Authored-by: Hyukjin Kwon <gu...@apache.org>
    Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
 .../apache/spark/sql/catalyst/expressions/collectionOperations.scala    | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
index 50da0fb12ec..3090916582e 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
@@ -2272,7 +2272,7 @@ case class ElementAt(
     newLeft: Expression, newRight: Expression): ElementAt = copy(left = newLeft, right = newRight)
 
   override def initQueryContext(): Option[SQLQueryContext] = {
-    if (failOnError && left.dataType.isInstanceOf[ArrayType]) {
+    if (failOnError && left.resolved && left.dataType.isInstanceOf[ArrayType]) {
       Some(origin.context)
     } else {
       None
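
For reference, the guarded method after this one-line change reads as below; the closing brace falls outside the hunk above and is reconstructed here:

```scala
override def initQueryContext(): Option[SQLQueryContext] = {
  // Only build the ANSI error context once the left child is resolved;
  // otherwise left.dataType would throw UnresolvedException (see above).
  if (failOnError && left.resolved && left.dataType.isInstanceOf[ArrayType]) {
    Some(origin.context)
  } else {
    None
  }
}
```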

