Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2022/08/22 20:34:34 UTC

[GitHub] [tvm] shingjan commented on a diff in pull request #12485: [Relay][Op] embedding_bag operator implementation

shingjan commented on code in PR #12485:
URL: https://github.com/apache/tvm/pull/12485#discussion_r951886852


##########
python/tvm/relay/frontend/pytorch.py:
##########
@@ -2242,6 +2243,29 @@ def embedding(self, inputs, input_types):
 
         return _op.take(weight, indices.astype("int32"), axis=0)
 
+    def embedding_bag(self, inputs, _):
+        assert len(inputs) == 9, "embedding_bag needs 9 arguments"
+        (
+            weights,
+            indices,
+            offsets_1d,
+            scale_grad_by_freq,
+            mode,
+            sparse,
+            per_sample_weights,
+            include_last_offset,

Review Comment:
   Is it possible to support `sparse`, `per_sample_weights`, and `include_last_offset` as well? At least for the `dlrm` model, `sparse` is needed. If we can't find a way to support `scale_grad_by_freq`, my take is that, for now, we should add an explicit check here and fail compilation when this argument is passed from PyTorch.
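
   The "explicit check, fail compilation" pattern suggested above could look roughly like the sketch below. This is illustrative only, not the actual TVM converter: the helper name `check_embedding_bag_args` is made up, and the assumption that the 9th `aten::embedding_bag` argument is `padding_idx` follows the PyTorch operator signature but is not confirmed by the truncated diff.

   ```python
   def check_embedding_bag_args(inputs):
       """Unpack embedding_bag converter inputs, failing fast on
       unsupported options (hypothetical helper sketching the review's
       suggestion; argument order mirrors aten::embedding_bag)."""
       assert len(inputs) == 9, "embedding_bag needs 9 arguments"
       (
           weights,
           indices,
           offsets_1d,
           scale_grad_by_freq,
           mode,
           sparse,
           per_sample_weights,
           include_last_offset,
           padding_idx,  # assumed name for the 9th argument
       ) = inputs
       # Explicitly reject the one option we cannot lower, instead of
       # silently producing a wrong graph.
       if scale_grad_by_freq:
           raise NotImplementedError(
               "scale_grad_by_freq=True is not supported by embedding_bag"
           )
       return (
           weights,
           indices,
           offsets_1d,
           mode,
           sparse,
           per_sample_weights,
           include_last_offset,
           padding_idx,
       )
   ```

   With this shape, a model that sets `scale_grad_by_freq=True` fails loudly at conversion time rather than compiling to an operator that ignores the flag.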



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@tvm.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org