Posted to dev@singa.apache.org by "Zheng Kaiping (JIRA)" <ji...@apache.org> on 2016/01/27 14:42:39 UTC
[jira] [Created] (SINGA-142) A bug in BPTTWorker::Backward() function with layers partially unrolled in neuralnet
Zheng Kaiping created SINGA-142:
-----------------------------------
Summary: A bug in BPTTWorker::Backward() function with layers partially unrolled in neuralnet
Key: SINGA-142
URL: https://issues.apache.org/jira/browse/SINGA-142
Project: Singa
Issue Type: Bug
Reporter: Zheng Kaiping
The bug is in SINGA_HOME/src/worker.cc, in the function "void BPTTWorker::Backward(int step, NeuralNet* net)". It affects training in neural nets whose layers are only partially unrolled.
Specifically, non-unrolled layers (i.e., layers without parameter sharing across unroll steps) should not be included when aggregating gradients for parameters; only parameters shared by unrolled layers should have their gradients summed over the unrolled steps.
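To illustrate the intended behavior, here is a minimal, self-contained sketch (not SINGA's actual code; the Param struct, the "shared" flag, and the Aggregate function are hypothetical simplifications of BPTTWorker's gradient aggregation): gradients of a shared (unrolled) parameter are summed over all unroll steps, while a non-unrolled parameter receives its gradient exactly once.

```cpp
#include <cassert>
#include <map>
#include <string>
#include <vector>

// Hypothetical simplified model of BPTT gradient aggregation.
struct Param {
  bool shared;        // true if the owning layer is unrolled (weights shared across steps)
  double grad = 0.0;  // aggregated gradient
};

// grads_per_step[t][name] = gradient contribution computed at unroll step t.
// Shared parameters accumulate over all steps; non-unrolled parameters
// must NOT be summed across steps (that would over-count their gradient).
void Aggregate(std::map<std::string, Param>& params,
               const std::vector<std::map<std::string, double>>& grads_per_step) {
  for (size_t t = 0; t < grads_per_step.size(); ++t) {
    for (const auto& kv : grads_per_step[t]) {
      Param& p = params[kv.first];
      if (p.shared) {
        p.grad += kv.second;   // sum over every unrolled step
      } else if (t == 0) {
        p.grad = kv.second;    // non-unrolled: take the gradient once only
      }
    }
  }
}
```

With two unroll steps each contributing a gradient of 1.0 to a shared parameter and 2.0 to a non-shared one, the shared parameter ends with 2.0 (summed) while the non-shared parameter stays at 2.0 (counted once), which is the distinction the fix is about.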
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)