Posted to issues@mxnet.apache.org by "Xingjian Shi (JIRA)" <ji...@apache.org> on 2018/03/12 17:50:00 UTC

[jira] [Updated] (MXNET-58) Add LayerNorm in MXNet

     [ https://issues.apache.org/jira/browse/MXNET-58?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xingjian Shi updated MXNET-58:
------------------------------
    Resolution: Fixed
        Status: Done  (was: To Do)

> Add LayerNorm in MXNet
> ----------------------
>
>                 Key: MXNET-58
>                 URL: https://issues.apache.org/jira/browse/MXNET-58
>             Project: Apache MXNet
>          Issue Type: New Feature
>            Reporter: Xingjian Shi
>            Priority: Major
>          Time Spent: 2.5h
>  Remaining Estimate: 0h
>
> # Implement layer normalization directly in C++. Both speed and memory cost are better than stacking broadcast/reduce ops (compared in the first sketch below). Solves [#9950|https://github.com/apache/incubator-mxnet/issues/9950]
>  # Add LayerNorm in Gluon (also shown in the first sketch below).
>  # Fix the doc of InstanceNorm: the axes actually normalized over are all axes excluding the 0th axis and the given axis (see the second sketch below).
>  # Fix the doc of BatchNorm: the inverse standard deviation, not the variance, is set as the output (see the third sketch below). Should fix [#9216|https://github.com/apache/incubator-mxnet/issues/9216]
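>
> A minimal sketch of the new fused operator against the stacked broadcast/reduce formulation, plus the new Gluon block. This assumes an MXNet build that already contains this change; the shapes, eps value, and tolerance printout are illustrative only:
>
> {code:python}
> from mxnet import nd
> from mxnet.gluon import nn
>
> x = nd.random.normal(shape=(2, 5))
> gamma = nd.ones(5)
> beta = nd.zeros(5)
>
> # Fused operator implemented in C++ by this issue.
> y_fused = nd.LayerNorm(x, gamma=gamma, beta=beta, axis=-1, eps=1e-5)
>
> # Same computation stacked from broadcast/reduce ops
> # (the slower, more memory-hungry baseline).
> mean = x.mean(axis=-1, keepdims=True)
> var = ((x - mean) ** 2).mean(axis=-1, keepdims=True)
> y_stacked = (x - mean) / nd.sqrt(var + 1e-5) * gamma + beta
> print(nd.abs(y_fused - y_stacked).max())  # expected to be ~0
>
> # The Gluon block added by this issue.
> ln = nn.LayerNorm(axis=-1)
> ln.initialize()
> print(ln(x).shape)  # (2, 5)
> {code}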
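>
> A small sketch of the clarified InstanceNorm semantics: for an (N, C, H, W) input with the channel axis as the given axis, the statistics are computed over all remaining non-batch axes (H and W here), separately per (sample, channel) pair. Shapes are illustrative:
>
> {code:python}
> from mxnet import nd
>
> x = nd.random.normal(shape=(2, 3, 4, 4))
> gamma = nd.ones(3)
> beta = nd.zeros(3)
> y = nd.InstanceNorm(x, gamma=gamma, beta=beta, eps=1e-5)
>
> # Manual check: reduce over every axis except axis 0 and the channel axis.
> mean = x.mean(axis=(2, 3), keepdims=True)
> var = ((x - mean) ** 2).mean(axis=(2, 3), keepdims=True)
> y_manual = ((x - mean) / nd.sqrt(var + 1e-5)
>             * gamma.reshape(1, 3, 1, 1) + beta.reshape(1, 3, 1, 1))
> print(nd.abs(y - y_manual).max())  # expected to be ~0
> {code}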
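>
> And a sketch of the BatchNorm doc fix: with output_mean_var=True the operator returns its batch statistics as extra outputs, and the third output holds the inverse standard deviation rather than the variance. The exact stored quantity can differ across versions and backends, so treat this as illustrative:
>
> {code:python}
> import numpy as np
> from mxnet import nd, autograd
>
> x = nd.random.normal(shape=(4, 3))
> gamma, beta = nd.ones(3), nd.zeros(3)
> moving_mean, moving_var = nd.zeros(3), nd.ones(3)
>
> # Batch statistics are only computed in training mode,
> # hence the autograd.record() scope.
> with autograd.record():
>     out, mean, inv_std = nd.BatchNorm(
>         x, gamma, beta, moving_mean, moving_var,
>         eps=1e-5, output_mean_var=True)
>
> var = x.asnumpy().var(axis=0)
> print(inv_std.asnumpy())          # matches 1/sqrt(var + eps) ...
> print(1.0 / np.sqrt(var + 1e-5))  # ... not the variance itself
> {code}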



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@mxnet.apache.org
For additional commands, e-mail: issues-help@mxnet.apache.org