Posted to dev@singa.apache.org by "ASF subversion and git services (JIRA)" <ji...@apache.org> on 2016/11/29 05:31:58 UTC
[jira] [Commented] (SINGA-275) Add Cross Entropy Loss for multiple labels
[ https://issues.apache.org/jira/browse/SINGA-275?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15704267#comment-15704267 ]
ASF subversion and git services commented on SINGA-275:
-------------------------------------------------------
Commit d1110c0b7101fff6999db1dd5cccb14bf8370578 in incubator-singa's branch refs/heads/master from [~wangwei.cs]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-singa.git;h=d1110c0 ]
SINGA-275 - Add Cross Entropy Loss for multiple labels
Updated the softmax cross entropy loss layer and the tensor functions to allow
the ground truth to be a binary array for each instance;
Added unit tests for cross entropy with multiple labels per instance;
For a batch of instances, the ground truth tensor can be either an integer
array, with one value per instance, or a binary matrix, with one row per
instance.
For a single-instance input, the feature tensor is a 1-d array, and the
ground truth tensor is a 1-d array (holding either a single integer value or a
binary array).
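
As a rough illustration of the two accepted ground-truth formats, here is a
minimal numpy sketch; the actual SINGA layer is implemented against its own
tensor API, so the function below is hypothetical and for exposition only:

    import numpy as np

    def softmax(x):
        # subtract the row max for numerical stability
        e = np.exp(x - x.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    def softmax_cross_entropy(features, truth):
        # features: (batch, dim) raw scores; truth: (batch,) integer labels
        # or a (batch, dim) binary matrix (illustrative helper, not SINGA's API)
        p = softmax(features)
        if truth.ndim == 1:
            # one integer label per instance
            return -np.log(p[np.arange(len(truth)), truth])
        # binary matrix, one row per instance; normalize each row to sum to 1
        t = truth / truth.sum(axis=1, keepdims=True)
        return -(t * np.log(p)).sum(axis=1)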
> Add Cross Entropy Loss for multiple labels
> ------------------------------------------
>
> Key: SINGA-275
> URL: https://issues.apache.org/jira/browse/SINGA-275
> Project: Singa
> Issue Type: New Feature
> Reporter: RUAN PINGCHENG
> Priority: Critical
>
> A cross entropy loss function that can handle multiple labels. Normalize the label vector during training and testing, e.g., <1, 0, 0, 1> -> <0.5, 0, 0, 0.5>.
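
For concreteness, the normalization in the example above amounts to dividing
the label vector by its sum (a numpy sketch, not SINGA code):

    import numpy as np
    t = np.array([1., 0., 0., 1.])
    t / t.sum()   # -> array([0.5, 0. , 0. , 0.5])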
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)