Posted to dev@singa.apache.org by "RUAN PINGCHENG (JIRA)" <ji...@apache.org> on 2016/11/25 06:06:58 UTC
[jira] [Created] (SINGA-275) Add Cross Entropy Loss for multiple labels
RUAN PINGCHENG created SINGA-275:
------------------------------------
Summary: Add Cross Entropy Loss for multiple labels
Key: SINGA-275
URL: https://issues.apache.org/jira/browse/SINGA-275
Project: Singa
Issue Type: New Feature
Reporter: RUAN PINGCHENG
Priority: Critical
Add a cross entropy loss function that can handle multiple labels per instance. Normalize the label vector during training and testing, e.g., <1, 0, 0, 1> -> <0.5, 0, 0, 0.5>.
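A minimal sketch of the proposed behavior, assuming the loss is computed against softmax probabilities; the function names here are illustrative and are not SINGA's actual API:

```python
import numpy as np

def normalize_labels(labels):
    """Normalize a multi-hot label vector so its entries sum to 1,
    e.g. [1, 0, 0, 1] -> [0.5, 0, 0, 0.5]."""
    labels = np.asarray(labels, dtype=np.float64)
    return labels / labels.sum(axis=-1, keepdims=True)

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multilabel_cross_entropy(logits, labels, eps=1e-12):
    """Cross entropy H(t, p) = -sum_i t_i * log(p_i), where t is the
    normalized multi-hot target and p the softmax probabilities."""
    t = normalize_labels(labels)
    p = softmax(np.asarray(logits, dtype=np.float64))
    return -np.sum(t * np.log(p + eps), axis=-1)

# Example: an instance with two positive labels out of four classes.
loss = multilabel_cross_entropy([2.0, 0.1, 0.1, 2.0], [1, 0, 0, 1])
```

With a single positive label this reduces to the ordinary softmax cross entropy, so the normalization step is the only change needed for the multi-label case.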