Posted to issues@opennlp.apache.org by "Jörn Kottmann (JIRA)" <ji...@apache.org> on 2011/06/02 13:00:47 UTC

[jira] [Commented] (OPENNLP-154) normalization in perceptron

    [ https://issues.apache.org/jira/browse/OPENNLP-154?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13042702#comment-13042702 ] 

Jörn Kottmann commented on OPENNLP-154:
---------------------------------------

Jason, is the replacement normalization you put in softmax normalization?

As described here?
http://en.wikipedia.org/wiki/Softmax_activation_function
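For reference, softmax normalization as described on that page maps arbitrary real-valued scores (including negative ones) to a proper probability distribution. A minimal Python sketch of the idea, not OpenNLP's actual code, with the usual max-subtraction trick for numerical stability:

```python
import math

def softmax(scores):
    # Subtract the max score before exponentiating so that very large
    # or very negative scores cannot overflow/underflow; the subtraction
    # cancels out in the ratio, so the result is unchanged mathematically.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Negative perceptron scores normalize to a valid distribution:
probs = softmax([-2.0, 0.5, -1.0])
```

All outputs are positive and sum to 1, and the relative ordering of the scores is preserved, which is why softmax sidesteps the negative-number problem described in the issue below.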



> normalization in perceptron
> ---------------------------
>
>                 Key: OPENNLP-154
>                 URL: https://issues.apache.org/jira/browse/OPENNLP-154
>             Project: OpenNLP
>          Issue Type: Bug
>          Components: Maxent
>    Affects Versions: maxent-3.0.1-incubating
>            Reporter: Jason Baldridge
>            Assignee: Jason Baldridge
>            Priority: Minor
>             Fix For: tools-1.5.2-incubating, maxent-3.0.2-incubating
>
>   Original Estimate: 0h
>  Remaining Estimate: 0h
>
> I found some issues with the way perceptron output was normalized. The previous approach to handling negative numbers was somewhat ad hoc and didn't really work.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira