Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/02/10 15:41:48 UTC

[GitHub] [spark] cloud-fan commented on issue #26977: [SPARK-30326][SQL] Raise exception if analyzer exceed max iterations

URL: https://github.com/apache/spark/pull/26977#issuecomment-584184628
 
 
   This needs to be in 3.0. In Spark 2.4, the analyzer's max iterations were controlled by `SQLConf.OPTIMIZER_MAX_ITERATIONS`. In 3.0, we added a new config for it, so existing queries that set `SQLConf.OPTIMIZER_MAX_ITERATIONS` to a large value may fail analysis after upgrading to Spark 3.0.
   
   With this PR, users get a clear error message telling them to set the new config.
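   For illustration, the behavior being discussed — a rule executor that applies rules to a plan until a fixed point, and raises a clear error once the iteration cap is hit — can be sketched as follows. This is a minimal, hypothetical Python sketch, not Spark's actual Scala implementation; the function names and error text are invented:

```python
# Hypothetical sketch of a fixed-point rule executor with an
# iteration cap, mirroring the behavior this PR adds to the analyzer.

def run_to_fixed_point(plan, rules, max_iterations):
    """Apply rules repeatedly until the plan stops changing."""
    for _ in range(max_iterations):
        new_plan = plan
        for rule in rules:
            new_plan = rule(new_plan)
        if new_plan == plan:
            return plan  # reached a fixed point
        plan = new_plan
    # Mirrors the PR's intent: fail loudly and point users at the config
    # that raises the cap, instead of silently returning a bad plan.
    raise RuntimeError(
        "Max iterations (%d) reached for the analyzer; "
        "raise the analyzer max-iterations config." % max_iterations
    )

# A rule that keeps changing the plan never converges, so the cap fires.
try:
    run_to_fixed_point(0, [lambda p: p + 1], max_iterations=5)
except RuntimeError as e:
    print("caught:", e)
```

   In the real analyzer the "plan" is a logical query plan and the rules are analysis rules, but the control flow is the same: converge or fail with an actionable message.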
   
   Thanks, merging to master/3.0!

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org