Posted to issues@mxnet.apache.org by "Joseph Bethge (Jira)" <ji...@apache.org> on 2019/08/22 13:36:00 UTC

[jira] [Commented] (MXNET-1420) Implement RAdam

    [ https://issues.apache.org/jira/browse/MXNET-1420?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16913311#comment-16913311 ] 

Joseph Bethge commented on MXNET-1420:
--------------------------------------

I am interested in this and might implement it myself. However, I am still familiarizing myself with both the MXNet contribution procedure and the paper, so it might take some time. If anyone wants to pick this up or collaborate, let me know.

> Implement RAdam
> ---------------
>
>                 Key: MXNET-1420
>                 URL: https://issues.apache.org/jira/browse/MXNET-1420
>             Project: Apache MXNet
>          Issue Type: New Feature
>          Components: Apache MXNet Backend, Gluon
>            Reporter: Dheeraj M
>            Priority: Major
>
> Implementation of the RAdam optimizer based on the new paper [ON THE VARIANCE OF THE ADAPTIVE LEARNING RATE AND BEYOND|https://arxiv.org/pdf/1908.03265v1.pdf]
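
For anyone picking this up, the update rule from the paper can be sketched roughly as follows. This is a plain NumPy illustration, not the eventual MXNet/Gluon API; the function name, defaults, and the warmup threshold are assumptions taken from the paper's Algorithm 2, and a real contribution would go through MXNet's `Optimizer` interface instead.

```python
# Illustrative sketch of one RAdam step (arXiv:1908.03265), not MXNet API.
import numpy as np

def radam_update(param, grad, m, v, t, lr=0.001,
                 beta1=0.9, beta2=0.999, eps=1e-8):
    """Apply one RAdam step at iteration t >= 1; returns (param, m, v)."""
    m = beta1 * m + (1 - beta1) * grad          # 1st moment (momentum)
    v = beta2 * v + (1 - beta2) * grad * grad   # 2nd moment
    m_hat = m / (1 - beta1 ** t)                # bias-corrected momentum
    rho_inf = 2.0 / (1 - beta2) - 1.0           # max length of approx. SMA
    rho_t = rho_inf - 2.0 * t * beta2 ** t / (1 - beta2 ** t)
    if rho_t > 4:
        # Variance of the adaptive rate is tractable: rectified Adam step.
        v_hat = np.sqrt(v / (1 - beta2 ** t))
        r_t = np.sqrt(((rho_t - 4) * (rho_t - 2) * rho_inf) /
                      ((rho_inf - 4) * (rho_inf - 2) * rho_t))
        param = param - lr * r_t * m_hat / (v_hat + eps)
    else:
        # Early iterations: fall back to SGD with momentum (no adaptive rate).
        param = param - lr * m_hat
    return param, m, v
```

As a sanity check, running this on a simple quadratic (gradient of `param**2` is `2 * param`) with a step of 0.1 for a couple of hundred iterations should drive the parameter toward zero; the first few steps take the momentum-only branch until `rho_t` exceeds 4.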



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@mxnet.apache.org
For additional commands, e-mail: issues-help@mxnet.apache.org