Posted to issues@spark.apache.org by "Takeshi Yamamuro (Jira)" <ji...@apache.org> on 2020/01/08 00:57:00 UTC

[jira] [Commented] (SPARK-26249) Extension Points Enhancements to inject a rule in order and to add a batch

    [ https://issues.apache.org/jira/browse/SPARK-26249?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17010223#comment-17010223 ] 

Takeshi Yamamuro commented on SPARK-26249:
------------------------------------------

I'll close this because the corresponding pr is inactive (automatically closed). If necessary, please reopen this. Thanks.

> Extension Points Enhancements to inject a rule in order and to add a batch
> --------------------------------------------------------------------------
>
>                 Key: SPARK-26249
>                 URL: https://issues.apache.org/jira/browse/SPARK-26249
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Sunitha Kambhampati
>            Priority: Major
>
> +Motivation:+  
> Spark has an extension points API that allows third parties to extend Spark with custom optimization rules. The current API does not allow fine-grained control over when an optimization rule is exercised. In particular, there is no way to add a batch to the optimizer through the SparkSessionExtensions API, similar to the postHocOptimizationBatches in SparkOptimizer.
> In our use cases, we have optimization rules that we want to add as extensions to a batch in a specific order.
> +Proposal:+ 
> Add two new APIs to the existing extension points to allow more flexibility for third-party users of Spark. 
>  # Inject an optimizer rule into a batch in order 
>  # Inject an optimizer batch in order
> The design spec is here:
> [https://drive.google.com/file/d/1m7rQZ9OZFl0MH5KS12CiIg3upLJSYfsA/view?usp=sharing]
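To make the two proposed injection points concrete, here is a minimal, self-contained Scala sketch of the intended ordering semantics. This is not Spark's actual API: `Batch`, `injectRuleInOrder`, and `injectBatchInOrder` are hypothetical names modeling the proposal (rules are represented as plain strings for brevity, where Spark would use `Rule[LogicalPlan]`).

```scala
// Hypothetical sketch of the proposed ordered-injection semantics.
// A batch is a named, ordered list of rules (simplified to strings here).
case class Batch(name: String, rules: Vector[String])

class ExtensionsSketch(initial: Vector[Batch]) {
  private var batches: Vector[Batch] = initial

  // Proposed API 1 (hypothetical): inject `rule` into batch `batchName`
  // immediately after `afterRule`; append to the batch if `afterRule`
  // is not found.
  def injectRuleInOrder(batchName: String, rule: String, afterRule: String): Unit =
    batches = batches.map { b =>
      if (b.name == batchName) {
        val idx = b.rules.indexOf(afterRule)
        val pos = if (idx >= 0) idx + 1 else b.rules.length
        b.copy(rules = (b.rules.take(pos) :+ rule) ++ b.rules.drop(pos))
      } else b
    }

  // Proposed API 2 (hypothetical): inject a whole batch immediately
  // after `afterBatch`; append if `afterBatch` is not found.
  def injectBatchInOrder(batch: Batch, afterBatch: String): Unit = {
    val idx = batches.indexWhere(_.name == afterBatch)
    val pos = if (idx >= 0) idx + 1 else batches.length
    batches = (batches.take(pos) :+ batch) ++ batches.drop(pos)
  }

  def batchNames: Vector[String] = batches.map(_.name)
  def rulesOf(name: String): Vector[String] =
    batches.find(_.name == name).map(_.rules).getOrElse(Vector.empty)
}
```

The key difference from the existing `SparkSessionExtensions.injectOptimizerRule` is the position argument: today an injected rule always lands in a fixed extension batch, whereas this proposal lets the caller pin a rule next to a specific built-in rule, or slot an entire custom batch between existing batches.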



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org