Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2022/06/28 19:33:29 UTC

[GitHub] [tvm] shingjan opened a new pull request, #11935: [PyTorch][Relay] Add aten::cross_entropy_loss

shingjan opened a new pull request, #11935:
URL: https://github.com/apache/tvm/pull/11935

   This PR adds `aten::cross_entropy_loss` to the PyTorch frontend. It relates to a previous correctness issue with `cross_entropy_loss_with_logits`, reported [here](https://github.com/apache/tvm/issues/9109).
   
   cc: @masahi 
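
   For background, `cross_entropy_loss` is `nll_loss` applied to a `log_softmax` output, which is the decomposition the converter emits via `_op.nn.nll_loss(_op.nn.log_softmax(input), ...)`. A dependency-free sketch of that identity (illustrative only, not part of the PR):

   ```python
   import math

   def log_softmax(row):
       # Numerically stable log-softmax over one row of logits.
       m = max(row)
       log_sum = m + math.log(sum(math.exp(x - m) for x in row))
       return [x - log_sum for x in row]

   def cross_entropy(logits, targets):
       # Mean-reduced negative log-likelihood of the target class,
       # matching reduction == 1 (the PyTorch default, "mean").
       losses = [-log_softmax(row)[t] for row, t in zip(logits, targets)]
       return sum(losses) / len(losses)
   ```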
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@tvm.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [tvm] masahi commented on a diff in pull request #11935: [PyTorch][Relay] Add aten::cross_entropy_loss

Posted by GitBox <gi...@apache.org>.
masahi commented on code in PR #11935:
URL: https://github.com/apache/tvm/pull/11935#discussion_r909197431


##########
python/tvm/relay/frontend/pytorch.py:
##########
@@ -867,6 +867,34 @@ def log_sigmoid(self, inputs, input_types):
         data = inputs[0]
         return _op.log(_op.tensor.sigmoid(data))
 
+    def cross_entropy_loss_with_logits(self, inputs, input_types):
+        input = inputs[0]
+        target = inputs[1]
+        weights = inputs[2]
+        reduction = inputs[3]
+        ignore_index = inputs[4]
+        label_smoothing = inputs[5]
+        input_shape = self.infer_shape(input)
+        target_shape = self.infer_shape(target)
+        if input_shape != target_shape:
+            if reduction == 0:
+                reduction = "none"
+            elif reduction == 1:
+                reduction = "mean"
+            else:
+                reduction = "sum"
+            num_class = self.infer_shape(input)[1]
+            if weights is None:
+                weights = _op.full(_expr.const(1), (num_class,), dtype=input_types[0])
+            return _op.nn.nll_loss(
+                _op.nn.log_softmax(input), target, weights, reduction, ignore_index
+            )
+        assert reduction == 1, "reduction not supported in cross_entropy_loss"
+        assert ignore_index == -100, "reduce not supported in cross_entropy_loss"

Review Comment:
   the typo is back





[GitHub] [tvm] shingjan commented on pull request #11935: [PyTorch][Relay] Add aten::cross_entropy_loss

Posted by GitBox <gi...@apache.org>.
shingjan commented on PR #11935:
URL: https://github.com/apache/tvm/pull/11935#issuecomment-1169359073

   @tvm-bot rerun




[GitHub] [tvm] masahi commented on a diff in pull request #11935: [PyTorch][Relay] Add aten::cross_entropy_loss

Posted by GitBox <gi...@apache.org>.
masahi commented on code in PR #11935:
URL: https://github.com/apache/tvm/pull/11935#discussion_r908883813


##########
python/tvm/relay/frontend/pytorch.py:
##########
@@ -867,6 +867,23 @@ def log_sigmoid(self, inputs, input_types):
         data = inputs[0]
         return _op.log(_op.tensor.sigmoid(data))
 
+    def cross_entropy_loss(self, inputs, input_types):
+        input = inputs[0]
+        target = inputs[1]
+        weight = inputs[2]
+        reduction = inputs[3]
+        ignore_index = inputs[4]
+        label_smoothing = inputs[5]
+        assert weight is None, "weight not supported in cross_entropy_loss"
+        assert reduction == 1, "reduction not supported in cross_entropy_loss"
+        assert ignore_index == -100, "reduce not supported in cross_entropy_loss"

Review Comment:
   typo, reduce -> ignore_index





[GitHub] [tvm] masahi merged pull request #11935: [PyTorch][Relay] Add aten::cross_entropy_loss

Posted by GitBox <gi...@apache.org>.
masahi merged PR #11935:
URL: https://github.com/apache/tvm/pull/11935






[GitHub] [tvm] shingjan commented on a diff in pull request #11935: [PyTorch][Relay] Add aten::cross_entropy_loss

Posted by GitBox <gi...@apache.org>.
shingjan commented on code in PR #11935:
URL: https://github.com/apache/tvm/pull/11935#discussion_r909240719


##########
python/tvm/relay/frontend/pytorch.py:
##########
@@ -867,6 +867,34 @@ def log_sigmoid(self, inputs, input_types):
         data = inputs[0]
         return _op.log(_op.tensor.sigmoid(data))
 
+    def cross_entropy_loss_with_logits(self, inputs, input_types):
+        input = inputs[0]
+        target = inputs[1]
+        weights = inputs[2]
+        reduction = inputs[3]
+        ignore_index = inputs[4]
+        label_smoothing = inputs[5]
+        input_shape = self.infer_shape(input)
+        target_shape = self.infer_shape(target)
+        if input_shape != target_shape:
+            if reduction == 0:
+                reduction = "none"
+            elif reduction == 1:
+                reduction = "mean"
+            else:
+                reduction = "sum"
+            num_class = self.infer_shape(input)[1]
+            if weights is None:
+                weights = _op.full(_expr.const(1), (num_class,), dtype=input_types[0])
+            return _op.nn.nll_loss(
+                _op.nn.log_softmax(input), target, weights, reduction, ignore_index
+            )
+        assert reduction == 1, "reduction not supported in cross_entropy_loss"
+        assert ignore_index == -100, "reduce not supported in cross_entropy_loss"

Review Comment:
   LOL this is fixed and rebased.
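
   For reference, the integer reduction codes that the TorchScript call passes in map to `nll_loss` reduction modes exactly as in the if/elif chain of the diff above. A minimal plain-Python stand-in (hypothetical names, not Relay code) behaves like this:

   ```python
   def nll_loss(log_probs, targets, reduction=1):
       # TorchScript encodes reduction as an int: 0 -> "none",
       # 1 -> "mean", anything else -> "sum", mirroring the diff above.
       losses = [-row[t] for row, t in zip(log_probs, targets)]
       if reduction == 0:
           return losses
       if reduction == 1:
           return sum(losses) / len(losses)
       return sum(losses)
   ```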





[GitHub] [tvm] shingjan commented on pull request #11935: [PyTorch][Relay] Add aten::cross_entropy_loss

Posted by GitBox <gi...@apache.org>.
shingjan commented on PR #11935:
URL: https://github.com/apache/tvm/pull/11935#issuecomment-1169358644

   @tvm-bot re-run




[GitHub] [tvm] masahi commented on pull request #11935: [PyTorch][Relay] Add aten::cross_entropy_loss

Posted by GitBox <gi...@apache.org>.
masahi commented on PR #11935:
URL: https://github.com/apache/tvm/pull/11935#issuecomment-1169865805

   @tvm-bot rerun




[GitHub] [tvm] shingjan commented on a diff in pull request #11935: [PyTorch][Relay] Add aten::cross_entropy_loss

Posted by GitBox <gi...@apache.org>.
shingjan commented on code in PR #11935:
URL: https://github.com/apache/tvm/pull/11935#discussion_r908924225


##########
python/tvm/relay/frontend/pytorch.py:
##########
@@ -867,6 +867,23 @@ def log_sigmoid(self, inputs, input_types):
         data = inputs[0]
         return _op.log(_op.tensor.sigmoid(data))
 
+    def cross_entropy_loss(self, inputs, input_types):
+        input = inputs[0]
+        target = inputs[1]
+        weight = inputs[2]
+        reduction = inputs[3]
+        ignore_index = inputs[4]
+        label_smoothing = inputs[5]
+        assert weight is None, "weight not supported in cross_entropy_loss"
+        assert reduction == 1, "reduction not supported in cross_entropy_loss"
+        assert ignore_index == -100, "reduce not supported in cross_entropy_loss"

Review Comment:
   fixed. Keen eyes!


