Posted to dev@lucene.apache.org by "Christine Poerschke (JIRA)" <ji...@apache.org> on 2018/10/11 17:16:00 UTC
[jira] [Commented] (SOLR-12780) Add support for Leaky ReLU and TanH activations in LTR contrib module
[ https://issues.apache.org/jira/browse/SOLR-12780?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16646782#comment-16646782 ]
Christine Poerschke commented on SOLR-12780:
--------------------------------------------
Thanks [~kamulau] for opening this ticket with a patch to add two additional activation functions!
I've just attached a slightly revised patch; the main difference is the (proposed) use of {{Math.tanh}} instead of the
{code}
(Math.exp(in) - Math.exp(-in))/(Math.exp(in) + Math.exp(-in))
{code}
formula - what do you think? Initially I had wondered about the benefits (or otherwise) of reducing the number of {{Math.exp}} calls; then your SOLR-12785 patch made me wonder whether [Apache Commons Math|http://commons.apache.org/proper/commons-math/javadocs/api-3.6.1/index.html] has activation functions, and that in turn led to the discovery that {{Math.tanh}} exists in Java itself!
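To illustrate the point (this is a standalone sketch, not code from the attached patch - the class and method names here are hypothetical): the explicit exp-based formula and {{Math.tanh}} agree numerically for ordinary inputs, but {{Math.tanh}} avoids the four {{Math.exp}} calls and stays well-defined for large inputs, where {{Math.exp(in)}} overflows to infinity and the quotient degenerates to NaN.

```java
public class TanhSketch {

    // The explicit formula as written in the original patch:
    // tanh(in) = (e^in - e^-in) / (e^in + e^-in)
    static double tanhExplicit(double in) {
        return (Math.exp(in) - Math.exp(-in)) / (Math.exp(in) + Math.exp(-in));
    }

    public static void main(String[] args) {
        // For moderate inputs the two formulations agree to machine precision.
        for (double in : new double[] {-2.0, -0.5, 0.0, 0.5, 2.0}) {
            if (Math.abs(tanhExplicit(in) - Math.tanh(in)) > 1e-12) {
                throw new AssertionError("mismatch at in=" + in);
            }
        }

        // For large inputs Math.exp(in) is infinite, so the explicit
        // formula computes Infinity/Infinity == NaN, whereas Math.tanh
        // saturates cleanly to 1.0.
        if (!Double.isNaN(tanhExplicit(1000.0))) {
            throw new AssertionError("expected NaN from explicit formula");
        }
        if (Math.tanh(1000.0) != 1.0) {
            throw new AssertionError("expected Math.tanh to saturate to 1.0");
        }
        System.out.println("ok");
    }
}
```

In an LTR model the scores feeding the activation are usually small, so the overflow case may be rare in practice, but the built-in method is both simpler and safer.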
> Add support for Leaky ReLU and TanH activations in LTR contrib module
> ---------------------------------------------------------------------
>
> Key: SOLR-12780
> URL: https://issues.apache.org/jira/browse/SOLR-12780
> Project: Solr
> Issue Type: New Feature
> Security Level: Public(Default Security Level. Issues are Public)
> Components: contrib - LTR
> Reporter: Kamuela Lau
> Priority: Minor
> Labels: ltr
> Attachments: SOLR-12780.patch, SOLR-12780.patch
>
>
> Add support for Leaky ReLU and TanH activation functions in NeuralNetworkModel.
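For reference, the other activation proposed by the ticket, Leaky ReLU, is the identity for non-negative inputs and a small linear slope for negative ones. The sketch below is a hypothetical standalone illustration (not the patch's actual NeuralNetworkModel code); the slope value 0.01 is a common default, not one specified by the ticket.

```java
public class LeakyReluSketch {

    // Leaky ReLU: f(in) = in for in >= 0, alpha * in otherwise.
    // alpha is the (small, positive) slope for negative inputs.
    static double leakyRelu(double in, double alpha) {
        return in >= 0 ? in : alpha * in;
    }

    public static void main(String[] args) {
        double alpha = 0.01; // hypothetical default slope
        if (leakyRelu(2.0, alpha) != 2.0) {
            throw new AssertionError("positive inputs should pass through");
        }
        // Unlike plain ReLU, negative inputs keep a small nonzero gradient.
        if (leakyRelu(-2.0, alpha) >= 0.0) {
            throw new AssertionError("negative inputs should stay negative");
        }
        System.out.println("ok");
    }
}
```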
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@lucene.apache.org
For additional commands, e-mail: dev-help@lucene.apache.org