Posted to dev@singa.apache.org by "ASF subversion and git services (JIRA)" <ji...@apache.org> on 2015/12/28 06:16:49 UTC

[jira] [Commented] (SINGA-116) Fix a bug in InnerProductLayer caused by weight matrix sharing

    [ https://issues.apache.org/jira/browse/SINGA-116?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15072429#comment-15072429 ] 

ASF subversion and git services commented on SINGA-116:
-------------------------------------------------------

Commit 0cb6cc222d00c0759dec41c18c80d9cb8b6befae in incubator-singa's branch refs/heads/master from [~flytosky]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-singa.git;h=0cb6cc2 ]

SINGA-116 Fix a bug in InnerProductLayer caused by weight matrix sharing

Fix a bug in reporting the loss of RBM layers.
The loss was averaged over batchsize * batchsize, when it should be averaged over batchsize only.
Updated the counter from counter_ += batchsize to counter_ += 1.
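The counter fix above can be sketched as follows. This is an illustrative reconstruction, not SINGA's actual code: the names LossReporter, loss_, counter_, and Update are assumptions made for the example. The point is that each call already averages over the batch, so the counter must advance by one per batch; advancing it by batchsize divides by batchsize twice.

```cpp
#include <cassert>
#include <vector>

// Hypothetical sketch of the reporting fix: the loss accumulated per
// mini-batch is already a per-sample average, so the counter must
// advance by 1 per batch. The old behavior (counter_ += batchsize)
// effectively averaged over batchsize * batchsize.
struct LossReporter {
  float loss_ = 0.0f;
  int counter_ = 0;

  // 'batch_losses' holds the per-sample losses of one mini-batch.
  void Update(const std::vector<float>& batch_losses) {
    float sum = 0.0f;
    for (float l : batch_losses) sum += l;
    loss_ += sum / batch_losses.size();  // average over batchsize once
    counter_ += 1;  // the fix: previously counter_ += batchsize
  }

  float AverageLoss() const { return loss_ / counter_; }
};
```

With the fix, averaging two batches whose mean losses are 2.0 and 3.0 yields 2.5; with the old counter update the same data would have reported a quarter of that.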


> Fix a bug in InnerProductLayer caused by weight matrix sharing
> --------------------------------------------------------------
>
>                 Key: SINGA-116
>                 URL: https://issues.apache.org/jira/browse/SINGA-116
>             Project: Singa
>          Issue Type: Bug
>            Reporter: wangwei
>            Assignee: wangwei
>
> There is a bug in the implementation of InnerProductLayer, which appears when training the auto-encoder example.
> The weight matrix of an inner-product layer in the encoder is shared by another layer in the decoder, but the decoder uses the transposed version.
> The current code does not handle matrix transposition.
> The bug can be fixed by checking the transpose_ field explicitly.
> We may later handle it implicitly using a Tensor class that handles transposition automatically.
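A minimal sketch of the transpose-aware inner product described in the issue, under assumptions: the Matrix struct, InnerProduct function, and the boolean transpose flag below are illustrative stand-ins, not SINGA's InnerProductLayer API. The idea is that when an encoder and a decoder share one weight matrix W, the decoder multiplies by W^T, so the shared storage can be reused if the layer checks a transpose flag instead of assuming one orientation.

```cpp
#include <vector>

// Illustrative row-major matrix; SINGA's actual code uses its own
// Blob/Param types.
struct Matrix {
  int rows, cols;
  std::vector<float> data;  // row-major, size rows * cols
  float At(int r, int c) const { return data[r * cols + c]; }
};

// Computes y = x * W, or y = x * W^T when 'transpose' is set, for a
// single input sample x. The flag mirrors the transpose_ field the
// issue says must be handled explicitly.
std::vector<float> InnerProduct(const std::vector<float>& x,
                                const Matrix& W, bool transpose) {
  int out_dim = transpose ? W.rows : W.cols;
  int in_dim  = transpose ? W.cols : W.rows;
  std::vector<float> y(out_dim, 0.0f);
  for (int j = 0; j < out_dim; ++j)
    for (int i = 0; i < in_dim; ++i)
      y[j] += x[i] * (transpose ? W.At(j, i) : W.At(i, j));
  return y;
}
```

With this shape, the encoder calls InnerProduct(x, W, false) and the decoder calls InnerProduct(h, W, true) on the very same matrix, which is the sharing pattern that exposed the bug. A Tensor class that carries a transpose attribute, as the issue suggests, would move this branch out of the layer code.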



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)