Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:25:12 UTC

[jira] [Updated] (SPARK-11579) Method SGDOptimizer and LBFGSOptimizer in FeedForwardTrainer should not create new optimizer every time they got invoked

     [ https://issues.apache.org/jira/browse/SPARK-11579?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-11579:
---------------------------------
    Labels: bulk-closed  (was: )

> Method SGDOptimizer and LBFGSOptimizer in FeedForwardTrainer should not create new optimizer every time they got invoked
> ------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-11579
>                 URL: https://issues.apache.org/jira/browse/SPARK-11579
>             Project: Spark
>          Issue Type: Improvement
>          Components: ML
>    Affects Versions: 1.6.0
>            Reporter: yuhao yang
>            Priority: Minor
>              Labels: bulk-closed
>
> This is just a small proposal based on some customer feedback. I can send a PR if it looks reasonable.
> Currently the methods SGDOptimizer and LBFGSOptimizer in FeedForwardTrainer create a new optimizer every time they are invoked. This is unintuitive, since users assume they are still working with the existing optimizer when they write:
>     feedForwardTrainer
>       .SGDOptimizer
>       .setMiniBatchFraction(0.002)
> yet it actually creates a new optimizer, discarding the properties that were set previously.
> A straightforward solution is to avoid creating a new optimizer when the current optimizer is already of the requested kind:
> if (!optimizer.isInstanceOf[LBFGS])
>     optimizer = new ...
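
The guard proposed above can be sketched as follows. This is a minimal standalone illustration of the pattern, not Spark's actual FeedForwardTrainer code: the Optimizer, GradientDescent, and LBFGS classes below are simplified stand-ins, and the real trainer wires the optimizer to gradients and update rules omitted here.

    // Simplified stand-ins for the optimizer hierarchy (hypothetical shapes).
    trait Optimizer
    class LBFGS extends Optimizer {
      private var convergenceTol = 1e-4
      def setConvergenceTol(tol: Double): this.type = { convergenceTol = tol; this }
    }
    class GradientDescent extends Optimizer {
      private var miniBatchFraction = 1.0
      def setMiniBatchFraction(f: Double): this.type = { miniBatchFraction = f; this }
      def getMiniBatchFraction: Double = miniBatchFraction
    }

    class Trainer {
      private var optimizer: Optimizer = new GradientDescent

      // Reuse the existing optimizer when it is already an SGD instance,
      // so properties set by earlier calls survive repeated accesses.
      def SGDOptimizer: GradientDescent = {
        if (!optimizer.isInstanceOf[GradientDescent]) {
          optimizer = new GradientDescent
        }
        optimizer.asInstanceOf[GradientDescent]
      }

      // Same guard for the LBFGS accessor: only replace the optimizer
      // when switching to a different optimizer kind.
      def LBFGSOptimizer: LBFGS = {
        if (!optimizer.isInstanceOf[LBFGS]) {
          optimizer = new LBFGS
        }
        optimizer.asInstanceOf[LBFGS]
      }
    }

With this guard, calling trainer.SGDOptimizer.setMiniBatchFraction(0.002) followed by another trainer.SGDOptimizer access returns the same configured instance, whereas switching to LBFGSOptimizer still (intentionally) creates a fresh optimizer of the new kind.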



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org