Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/01/19 14:28:26 UTC

[jira] [Assigned] (SPARK-19290) add a new extending interface in Analyzer for post-hoc resolution

     [ https://issues.apache.org/jira/browse/SPARK-19290?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-19290:
------------------------------------

    Assignee: Apache Spark  (was: Wenchen Fan)

> add a new extending interface in Analyzer for post-hoc resolution
> -----------------------------------------------------------------
>
>                 Key: SPARK-19290
>                 URL: https://issues.apache.org/jira/browse/SPARK-19290
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Wenchen Fan
>            Assignee: Apache Spark
>
> To implement DDL commands, we added several analyzer rules in the sql/hive module to analyze DDL-related plans. However, the Analyzer currently has only one extension point: extendedResolutionRules, which defines extra rules that run together with the other rules in the resolution batch. This doesn't fit DDL rules well, because:
> 1. DDL rules may do some checking and normalization, but the resolution batch runs its rules repeatedly until a fixed point is reached, so that work may be repeated many times, and it's hard to tell whether a DDL rule has already done its checking and normalization. The result is still correct because DDL rules are idempotent, but it's bad for analysis performance.
> 2. Some DDL rules may depend on others, and it's pretty hard to write conditional checks that guarantee those dependencies. It would be better to have a batch that runs its rules in a single pass, so that dependencies can be guaranteed by rule order alone.
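The distinction the issue draws, between a fixed-point batch that re-runs rules until the plan stops changing and a run-once batch whose ordering alone guarantees dependencies, can be sketched as follows. This is a minimal illustration with made-up names (`Plan`, `Rule`, `BatchDemo`), not Spark's actual Analyzer classes:

```scala
// Minimal sketch of fixed-point vs. run-once rule batches.
// All names here are illustrative, not Spark's real API.
object BatchDemo {
  type Plan = List[String]   // stand-in for a logical plan
  type Rule = Plan => Plan   // stand-in for an analyzer rule

  // Fixed-point batch: re-run every rule until the plan stops changing.
  // A rule cannot tell whether it already ran, so its checking and
  // normalization work is repeated on every iteration.
  def runToFixedPoint(rules: Seq[Rule], plan: Plan, maxIter: Int = 100): Plan = {
    var current = plan
    var changed = true
    var iters   = 0
    while (changed && iters < maxIter) {
      val next = rules.foldLeft(current)((p, r) => r(p))
      changed = next != current
      current = next
      iters += 1
    }
    current
  }

  // Run-once batch: a single ordered pass, so placing rule A before
  // rule B in the sequence is enough to guarantee A runs first.
  def runOnce(rules: Seq[Rule], plan: Plan): Plan =
    rules.foldLeft(plan)((p, r) => r(p))

  def main(args: Array[String]): Unit = {
    val normalize: Rule = _.map(_.toLowerCase) // e.g. normalize identifiers
    val dedupe: Rule    = _.distinct           // e.g. a rule relying on normalize
    val plan = List("CREATE", "create", "TABLE")
    println(runOnce(Seq(normalize, dedupe), plan))        // prints List(create, table)
    println(runToFixedPoint(Seq(normalize, dedupe), plan))
  }
}
```

In the run-once variant, `dedupe` can safely assume `normalize` already ran because it appears later in the sequence; in the fixed-point variant both rules are applied on every iteration until nothing changes, which is why idempotent-but-repeated DDL checks cost analysis time.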



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org