Posted to issues@spark.apache.org by "Denis Tarima (Jira)" <ji...@apache.org> on 2021/12/20 12:53:00 UTC

[jira] [Created] (SPARK-37696) Optimizer exceeds max iterations

Denis Tarima created SPARK-37696:
------------------------------------

             Summary: Optimizer exceeds max iterations
                 Key: SPARK-37696
                 URL: https://issues.apache.org/jira/browse/SPARK-37696
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.2.0
            Reporter: Denis Tarima


A specific scenario causes Spark to fail in tests and to emit the following warnings in production:

21/12/20 06:45:24 WARN BaseSessionStateBuilder$$anon$2: Max iterations (100) reached for batch Operator Optimization before Inferring Filters, please set 'spark.sql.optimizer.maxIterations' to a larger value.
21/12/20 06:45:24 WARN BaseSessionStateBuilder$$anon$2: Max iterations (100) reached for batch Operator Optimization after Inferring Filters, please set 'spark.sql.optimizer.maxIterations' to a larger value.
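The warning's suggested mitigation is to raise the iteration cap. A minimal sketch of doing so in `spark-shell` (an assumption: this only helps if the batch eventually converges at a higher count rather than oscillating indefinitely):

// raise the optimizer's iteration cap from the default of 100
spark.conf.set("spark.sql.optimizer.maxIterations", 200)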

 

To reproduce, run the following commands in `spark-shell`:

// define case class for a struct type in an array
case class S(v: Int, v2: Int)

// prepare a table with an array of structs
Seq((10, Seq(S(1, 2)))).toDF("i", "data").write.saveAsTable("tbl")

// select using SQL and join with a dataset using "left_anti"
spark.sql("select i, data[size(data) - 1].v from tbl").join(Seq(10).toDF("i"), Seq("i"), "left_anti").show()

 

The following conditions are all required to trigger the issue:
 # Having an additional `v2` field in `S`
 # Using `data[size(data) - 1]` instead of `element_at(data, -1)` (see the sketch after this list)
 # Using `left_anti` in the join operation
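For comparison, condition 2 implies that fetching the last element with `element_at` avoids the issue. A minimal sketch of that variant, reusing the `tbl` table created above:

// same query, but with element_at(data, -1) in place of data[size(data) - 1];
// per condition 2 this should not hit the iteration limit
spark.sql("select i, element_at(data, -1).v from tbl").join(Seq(10).toDF("i"), Seq("i"), "left_anti").show()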

 

The same behavior was observed on the `master` branch and in `3.1.1`.
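To see which optimizer rules keep rewriting the plan past the limit, plan-change logging may help. A sketch, assuming the `spark.sql.planChangeLog.level` configuration available in Spark 3.1+ and that it can be set at runtime:

// log every plan rewrite at WARN level, then re-run the failing query
spark.conf.set("spark.sql.planChangeLog.level", "warn")
spark.sql("select i, data[size(data) - 1].v from tbl").join(Seq(10).toDF("i"), Seq("i"), "left_anti").explain()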


