Posted to issues@spark.apache.org by "Edoardo Vacchi (JIRA)" <ji...@apache.org> on 2015/05/22 10:39:17 UTC

[jira] [Updated] (SPARK-7823) [SQL] Batch, FixedPoint, Strategy should not be inner classes of class RuleExecutor

     [ https://issues.apache.org/jira/browse/SPARK-7823?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Edoardo Vacchi updated SPARK-7823:
----------------------------------
    Summary: [SQL] Batch, FixedPoint, Strategy should not be inner classes of class RuleExecutor  (was: Batch, FixedPoint, Strategy should not be inner classes of class RuleExecutor)

> [SQL] Batch, FixedPoint, Strategy should not be inner classes of class RuleExecutor
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-7823
>                 URL: https://issues.apache.org/jira/browse/SPARK-7823
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.1, 1.4.0
>            Reporter: Edoardo Vacchi
>            Priority: Minor
>
> Batch, FixedPoint, Strategy, and Once are defined within the class RuleExecutor[TreeType]. This makes it unnecessarily complicated to reuse batches of rules within custom optimizers. E.g.:
> {code:java}
> object DefaultOptimizer extends Optimizer {
>   override val batches = /* batches defined here */
> }
> object MyCustomOptimizer extends Optimizer {
>   override val batches = 
>     Batch("my custom batch" ...) ::
>     DefaultOptimizer.batches
> }
> {code}
> MyCustomOptimizer won't compile, because DefaultOptimizer.batches has the path-dependent type "Seq[DefaultOptimizer.this.Batch]", which is incompatible with MyCustomOptimizer's own Batch type.
> Solution: Batch, FixedPoint, etc. should be moved *outside* the RuleExecutor[T] class body, either into a companion object or directly into the `rules` package.
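
A minimal, self-contained sketch of the companion-object variant of the proposed fix (simplified stand-ins, not the actual Catalyst API: rules are modeled as plain strings for brevity). Once Batch and Strategy live in the companion object, their types are no longer path-dependent on a particular RuleExecutor instance, so batches from one optimizer can be prepended to by another:

```scala
// Hypothetical simplification of org.apache.spark.sql.catalyst.rules:
// Batch/Strategy moved out of the class body into a companion object.
object RuleExecutor {
  sealed abstract class Strategy { def maxIterations: Int }
  case object Once extends Strategy { val maxIterations = 1 }
  case class FixedPoint(maxIterations: Int) extends Strategy
  // In real Catalyst a Batch holds Rule[TreeType] instances; strings suffice here.
  case class Batch(name: String, strategy: Strategy, rules: String*)
}

abstract class RuleExecutor {
  import RuleExecutor._
  def batches: Seq[Batch]
}

object DefaultOptimizer extends RuleExecutor {
  import RuleExecutor._
  override val batches = Seq(Batch("constant folding", FixedPoint(100), "FoldRule"))
}

object MyCustomOptimizer extends RuleExecutor {
  import RuleExecutor._
  // Compiles now: Batch is a stable, shared type rather than
  // DefaultOptimizer.this.Batch vs. MyCustomOptimizer.this.Batch.
  override val batches =
    Batch("my custom batch", Once, "MyRule") +: DefaultOptimizer.batches
}
```

Note the use of `+:` to prepend a single Batch to the inherited Seq; the `::` in the snippet above would only work on a List and would otherwise nest the sequence rather than concatenate it.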



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org