Posted to dev@singa.apache.org by GitBox <gi...@apache.org> on 2019/08/14 09:19:03 UTC

[GitHub] [incubator-singa] nudles commented on a change in pull request #482: SINGA-474 hardsigmoid operator

nudles commented on a change in pull request #482: SINGA-474 hardsigmoid operator
URL: https://github.com/apache/incubator-singa/pull/482#discussion_r313780583
 
 

 ##########
 File path: python/singa/autograd.py
 ##########
 @@ -358,6 +358,51 @@ def relu(x):
     return ReLU()(x)[0]
 
 
+class HardSigmoid(Operation):
+    def __init__(self, alpha=0.2, gamma=0.5):
+        super(HardSigmoid, self).__init__()
+        self.alpha = alpha
+        self.gamma = gamma
+
+    def forward(self, x):
+        """Do forward propgation.
+        #y = max(0, min(1, alpha * x + gamma))
+        Args:
+            x (CTensor): matrix
+        Returns:
+            a CTensor for the result
+        """
+        if training:
+            self.input = x
+
+        x = singa.AddFloat(singa.MultFloat(x, self.alpha), self.gamma)
+        x = singa.ReLU(x)
+        mask = singa.LTFloat(x, 1.0)
+        mask2 = singa.GEFloat(x, 1.0)
+
+        ans = singa.__add__(singa.__mul__(x, mask), mask2)
 
 Review comment:
   ReLU will turn x into x if x >= 0 and into 0 otherwise;
   then you do not need to add 1?
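
For reference, here is a minimal NumPy sketch (an illustration, not SINGA's API) of the hard-sigmoid formula from the docstring, shown next to the mask-based variant that the diff builds from ReLU / LTFloat / GEFloat, so the two can be compared on sample values:

import numpy as np

def hard_sigmoid_clip(x, alpha=0.2, gamma=0.5):
    # Direct form of the formula: y = max(0, min(1, alpha * x + gamma)).
    return np.clip(alpha * x + gamma, 0.0, 1.0)

def hard_sigmoid_masks(x, alpha=0.2, gamma=0.5):
    # Mask-based form mirroring the diff: ReLU handles max(0, .),
    # then t * (t < 1) + (t >= 1) handles min(., 1).
    t = np.maximum(alpha * x + gamma, 0.0)   # ReLU step
    mask = (t < 1.0).astype(t.dtype)         # LTFloat
    mask2 = (t >= 1.0).astype(t.dtype)       # GEFloat
    return t * mask + mask2

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(hard_sigmoid_clip(x))    # [0.  0.3 0.5 0.7 1. ]
print(hard_sigmoid_masks(x))   # same values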

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services