Posted to issues@spark.apache.org by "Xin Wu (Jira)" <ji...@apache.org> on 2019/12/23 01:34:00 UTC

[jira] [Updated] (SPARK-30326) Raise exception if analyzer exceeds max iterations

     [ https://issues.apache.org/jira/browse/SPARK-30326?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xin Wu updated SPARK-30326:
---------------------------
    Description: Currently, both the analyzer and the optimizer just log a warning message if rule execution exceeds the max iterations. They should behave differently: the analyzer should raise an exception to indicate that the plan has not reached a fixed point after the max iterations, while the optimizer should just log a warning and keep the current plan. This is more feasible after SPARK-30138 was introduced.  (was: Currently, both the analyzer and the optimizer just log a warning message if rule execution exceeds the max iterations. They should behave differently: the analyzer should raise an exception to indicate that logical plan resolution failed, while the optimizer should just log a warning and keep the current plan. This is more feasible after SPARK-30138 was introduced.)

> Raise exception if analyzer exceeds max iterations
> --------------------------------------------------
>
>                 Key: SPARK-30326
>                 URL: https://issues.apache.org/jira/browse/SPARK-30326
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Xin Wu
>            Priority: Major
>
> Currently, both the analyzer and the optimizer just log a warning message if rule execution exceeds the max iterations. They should behave differently: the analyzer should raise an exception to indicate that the plan has not reached a fixed point after the max iterations, while the optimizer should just log a warning and keep the current plan. This is more feasible after SPARK-30138 was introduced.
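To illustrate the proposed split, here is a minimal, hypothetical sketch of a fixed-point rule executor. This is NOT Spark's actual RuleExecutor API; the class, method names, and string-based "plan" are invented for illustration only. It shows the two behaviors the issue contrasts: fail hard when the plan has not stabilized (analyzer-style) versus warn and keep the current plan (optimizer-style).

```java
import java.util.List;
import java.util.function.UnaryOperator;

// Hypothetical sketch, not Spark code: rules are applied repeatedly until the
// "plan" stops changing (a fixed point) or maxIterations is reached.
final class FixedPointExecutor {
    static String execute(String plan, List<UnaryOperator<String>> rules,
                          int maxIterations, boolean failOnMaxIterations) {
        for (int i = 0; i < maxIterations; i++) {
            String before = plan;
            for (UnaryOperator<String> rule : rules) {
                plan = rule.apply(plan);
            }
            if (plan.equals(before)) {
                return plan; // fixed point reached, plan is stable
            }
        }
        String msg = "Max iterations (" + maxIterations + ") reached";
        if (failOnMaxIterations) {
            // Analyzer-style: the plan is not fixed, so raise an exception.
            throw new IllegalStateException(msg);
        }
        // Optimizer-style: log a warning and keep the current plan.
        System.err.println("WARN: " + msg);
        return plan;
    }
}
```

Under this sketch, the analyzer would be constructed with failOnMaxIterations = true and the optimizer with false, so a non-converging rule set surfaces as an error during analysis but only as a warning during optimization.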



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org