Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/01/04 15:29:39 UTC
[jira] [Updated] (SPARK-7727) Avoid inner classes in RuleExecutor
[ https://issues.apache.org/jira/browse/SPARK-7727?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-7727:
-----------------------------
Assignee: Stephan Kessler
> Avoid inner classes in RuleExecutor
> -----------------------------------
>
> Key: SPARK-7727
> URL: https://issues.apache.org/jira/browse/SPARK-7727
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 1.3.1
> Reporter: Santiago M. Mola
> Assignee: Stephan Kessler
> Labels: easyfix, starter
> Fix For: 2.0.0
>
>
> In RuleExecutor, the following classes and objects are defined as inner classes or objects: Strategy, Once, FixedPoint, Batch.
> Nesting them does not seem to accomplish anything here, but it makes extensibility harder. For example, to define a new Optimizer that uses all the batches from the DefaultOptimizer plus some additional ones, I would write something like:
> {code}
> new Optimizer {
>   override protected val batches: Seq[Batch] =
>     DefaultOptimizer.batches ++ myBatches
> }
> {code}
> But this gives a type error, because the batches in DefaultOptimizer have type DefaultOptimizer#Batch while myBatches have type this.Batch.
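> To illustrate the path-dependent type issue in isolation, here is a minimal sketch (Executor, DefaultExecutor and CustomExecutor are made-up names, not the actual RuleExecutor code):
> {code}
> abstract class Executor {
>   // Batch is an inner class, so its type is bound to the enclosing instance
>   case class Batch(name: String)
>   val batches: Seq[Batch]
> }
>
> object DefaultExecutor extends Executor {
>   val batches: Seq[Batch] = Seq(Batch("default"))
> }
>
> object CustomExecutor extends Executor {
>   // Does not compile: DefaultExecutor.batches is a Seq[DefaultExecutor.Batch],
>   // which is not a Seq[CustomExecutor.Batch]
>   val batches: Seq[Batch] = DefaultExecutor.batches ++ Seq(Batch("extra"))
> }
> {code}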
> Workarounds include either copying the list of batches from DefaultOptimizer or using a method like this:
> {code}
> private def transformBatchType(b: DefaultOptimizer.Batch): Batch = {
>   val strategy = b.strategy.maxIterations match {
>     case 1 => Once
>     case n => FixedPoint(n)
>   }
>   Batch(b.name, strategy, b.rules)
> }
> {code}
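> (With such a helper, the combined batch list would be built roughly as {{DefaultOptimizer.batches.map(transformBatchType) ++ myBatches}}, assuming myBatches are already of the local Batch type.)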
> However, making these classes outer would solve the problem.
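> As a rough sketch of what that could look like (simplified, illustrative signatures; the real Batch would also carry the rules to run):
> {code}
> object RuleExecutor {
>   // Defined once at the object level, so every executor shares the same types
>   abstract class Strategy { def maxIterations: Int }
>   case object Once extends Strategy { val maxIterations = 1 }
>   case class FixedPoint(maxIterations: Int) extends Strategy
>
>   // Simplified: the actual Batch would also reference the rules and their tree type
>   case class Batch(name: String, strategy: Strategy)
> }
>
> // An extending optimizer could then compose directly, since both sides use RuleExecutor.Batch:
> //   override protected val batches = DefaultOptimizer.batches ++ myBatches
> {code}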