Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2020/11/10 17:20:43 UTC

[GitHub] [incubator-tvm] tkonolige commented on a change in pull request #6794: [TOPI] Add embedding op and gradient

tkonolige commented on a change in pull request #6794:
URL: https://github.com/apache/incubator-tvm/pull/6794#discussion_r520735630



##########
File path: python/tvm/topi/x86/nn.py
##########
@@ -69,3 +69,33 @@ def schedule_softmax(outs):
         s[exp].compute_at(s[softmax], fused_outer_axes)
 
     return s
+
+
+def schedule_embed_grad(outs):
+    """Schedule for embed_grad
+
+    Parameters
+    ----------
+    outs: Array of Tensor
+          The computation graph description of embed_grad
+          in the format of an array of tensors.
+
+    Returns
+    -------
+    sch: Schedule
+        The computation schedule for the op.
+    """
+    s = te.create_schedule([outs[0].op])
+
+    vec_size = 8  # should autotune this, but we can't with hybrid script

Review comment:
       We could reuse it. I just didn't have a good way to figure out the width of the vector instructions.
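
       One possible approach (an untested sketch; the helper name target_vector_width and the
       mcpu groupings below are assumptions, not an existing TVM API): instead of hardcoding
       vec_size = 8, the schedule could ask the current target for its mcpu string and derive
       the fp32 lane count from that, falling back to 8 when no target is set.

           import tvm

           def target_vector_width(default=8):
               """Guess the number of fp32 lanes for the current x86 target (sketch)."""
               target = tvm.target.Target.current(allow_none=True)
               if target is None or not target.mcpu:
                   return default
               # Assumed groupings: AVX-512 parts get 16 lanes, SSE-only parts get 4,
               # everything else (AVX/AVX2-class) keeps the default of 8.
               if target.mcpu in ("skylake-avx512", "cascadelake", "icelake-server"):
                   return 16
               if target.mcpu in ("core2", "nehalem"):
                   return 4
               return default

           # usage inside schedule_embed_grad, replacing the hardcoded constant:
           # vec_size = target_vector_width()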




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org