Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2020/04/21 18:28:36 UTC

[GitHub] [incubator-tvm] siju-samuel opened a new pull request #5395: [RELAY][PYTORCH]cosh,sinh,log2,log10,log1p op support

siju-samuel opened a new pull request #5395:
URL: https://github.com/apache/incubator-tvm/pull/5395


   - cosh
   - sinh
   - log2
   - log10
   - log1p
    These ops are supported in Relay and the PyTorch frontend (a brief usage sketch follows at the end of this message).
   
   @masahi please help review this PR. Thanks in advance.
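
   A minimal sketch, not taken from the PR itself, of how the new ops could be exercised end to end through the PyTorch frontend. The module name LogOps, the input name "input0", and the shapes are illustrative assumptions, and exact behavior depends on the installed torch/TVM versions.

      import torch
      import tvm
      from tvm import relay

      class LogOps(torch.nn.Module):
          def forward(self, x):
              # log-family ops covered by this PR
              return torch.log2(x) + torch.log10(x) + torch.log1p(x)

      inp = torch.rand(1, 3) + 1.0                      # keep inputs strictly positive
      scripted = torch.jit.trace(LogOps().eval(), inp)  # TorchScript module for import
      mod, params = relay.frontend.from_pytorch(scripted, [("input0", inp.shape)])
      print(mod)  # the printed Relay IR should contain log2 / log10 / log1p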


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] cbalint13 commented on issue #5395: [RELAY][PYTORCH]cosh,sinh,log2,log10,log1p op support

Posted by GitBox <gi...@apache.org>.
cbalint13 commented on issue #5395:
URL: https://github.com/apache/incubator-tvm/pull/5395#issuecomment-617907446


   Looks good to me.





[GitHub] [incubator-tvm] cbalint13 commented on a change in pull request #5395: [RELAY][PYTORCH]cosh,sinh,log2,log10,log1p op support

Posted by GitBox <gi...@apache.org>.
cbalint13 commented on a change in pull request #5395:
URL: https://github.com/apache/incubator-tvm/pull/5395#discussion_r412425014



##########
File path: python/tvm/relay/op/_tensor_grad.py
##########
@@ -61,6 +61,24 @@ def log_grad(orig, grad):
     return [grad * ones_like(x) / x]
 
 
+@register_gradient("log2")
+def log2_grad(orig, grad):
+    """Returns [grad * 1 / (log(2) * x)]"""
+    x = orig.args[0]
+    ones = ones_like(x)
+    two = const(2.0)
+    return [grad * ones / (log(two) * x)]
+
+
+@register_gradient("log10")
+def log10_grad(orig, grad):
+    """Returns [grad * 1 / (log(10) * x)]"""
+    x = orig.args[0]
+    ones = ones_like(x)
+    ten = const(2.0)

Review comment:
       should be const(10.0)
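
   A sketch of the registration with the suggested fix applied; it mirrors log2_grad above and relies on the same helpers (register_gradient, ones_like, const, log) already in scope in _tensor_grad.py:

      @register_gradient("log10")
      def log10_grad(orig, grad):
          """Returns [grad * 1 / (log(10) * x)]"""
          x = orig.args[0]
          ones = ones_like(x)
          ten = const(10.0)  # base constant corrected from 2.0 to 10.0
          return [grad * ones / (log(ten) * x)]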







[GitHub] [incubator-tvm] masahi commented on issue #5395: [RELAY][PYTORCH]cosh,sinh,log2,log10,log1p op support

Posted by GitBox <gi...@apache.org>.
masahi commented on issue #5395:
URL: https://github.com/apache/incubator-tvm/pull/5395#issuecomment-618174945


   @siju-samuel Is adding gradients important? Since they are not tested, I think it would be better to either remove them or add gradient tests.
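
   For reference, one possible shape for such a test, assuming the check_grad helper from tvm.relay.testing that TVM's existing gradient tests use; the op name relay.log10 and the parameters below are illustrative:

      import numpy as np
      from tvm import relay
      from tvm.relay.testing import check_grad  # numerical-vs-symbolic gradient check

      def test_log10_grad():
          x = relay.var("x", shape=(3, 4), dtype="float32")
          fwd = relay.Function([x], relay.log10(x))
          # strictly positive inputs keep log10 and its gradient well-defined
          data = np.random.uniform(low=1.0, high=10.0, size=(3, 4)).astype("float32")
          check_grad(fwd, inputs=[data])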

