Posted to dev@singa.apache.org by "ASF subversion and git services (JIRA)" <ji...@apache.org> on 2016/02/24 10:26:18 UTC

[jira] [Commented] (SINGA-145) New SGD based optimization Updaters: AdaDelta, Adam, AdamMax

    [ https://issues.apache.org/jira/browse/SINGA-145?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15162722#comment-15162722 ] 

ASF subversion and git services commented on SINGA-145:
-------------------------------------------------------

Commit 68140079edc5a1c35676a4553e8cc1498ad8784a in incubator-singa's branch refs/heads/master from ijingo
[ https://git-wip-us.apache.org/repos/asf?p=incubator-singa.git;h=6814007 ]

SINGA-145 New SGD based optimization Updaters: AdaDelta, Adam, AdamMax

exclude unnecessary low-level optimization


> New SGD based optimization Updaters: AdaDelta, Adam, AdamMax
> ------------------------------------------------------------
>
>                 Key: SINGA-145
>                 URL: https://issues.apache.org/jira/browse/SINGA-145
>             Project: Singa
>          Issue Type: New Feature
>         Environment: Universal
>            Reporter: Wang Ji
>            Priority: Minor
>
> This ticket implements three Stochastic Gradient Descent (SGD) based optimization algorithms in Updater.cc: AdaDelta, and Adaptive Moment Estimation (Adam) together with its variant AdamMax. These algorithms adapt the learning rate per parameter, performing larger updates for infrequently updated parameters and smaller updates for frequently updated ones.
> For algorithm details, refer to AdaDelta (http://arxiv.org/abs/1212.5701) and Adam (http://arxiv.org/pdf/1412.6980.pdf).
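
As a minimal, self-contained C++ sketch of the Adam update rule from the referenced paper (the struct and member names below are hypothetical illustrations, not SINGA's actual Updater API):

    #include <cmath>
    #include <vector>

    // Hypothetical sketch of the Adam update (Kingma & Ba, 2014).
    // Names are illustrative only and do not reflect SINGA's Updater.cc.
    struct AdamSketch {
      float lr = 1e-3f, beta1 = 0.9f, beta2 = 0.999f, eps = 1e-8f;
      std::vector<float> m, v;  // first and second moment estimates
      int t = 0;                // timestep

      void Update(std::vector<float>& param, const std::vector<float>& grad) {
        if (m.empty()) {
          m.assign(param.size(), 0.f);
          v.assign(param.size(), 0.f);
        }
        ++t;
        for (size_t i = 0; i < param.size(); ++i) {
          // Exponential moving averages of the gradient and squared gradient.
          m[i] = beta1 * m[i] + (1 - beta1) * grad[i];
          v[i] = beta2 * v[i] + (1 - beta2) * grad[i] * grad[i];
          // Bias correction compensates for the zero initialization of m and v.
          float mhat = m[i] / (1 - std::pow(beta1, t));
          float vhat = v[i] / (1 - std::pow(beta2, t));
          // Per-parameter step: large where accumulated gradients are small.
          param[i] -= lr * mhat / (std::sqrt(vhat) + eps);
        }
      }
    };

The bias-correction terms (1 - beta^t) matter mainly in early iterations, before the moment estimates warm up; the division by sqrt(vhat) is what gives each parameter its own effective learning rate.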



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)