Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2021/04/14 03:03:39 UTC

[GitHub] [tvm] yuchaoli opened a new pull request #7845: Fix PyTorch matmul conversion when given (2-dim, N-dim) input pair

yuchaoli opened a new pull request #7845:
URL: https://github.com/apache/tvm/pull/7845


   This PR changes the matmul conversion in the PyTorch frontend when given a (2-dim, N-dim) input pair (N > 2).
   
   The original implementation converts [2-dim matrix * 3-dim tensor] into a [batch matmul]; however, it can be implemented with a simple [matmul] of [2-dim matrix * 2-dim matrix] by reshaping the 3-dim tensor into a 2-dim matrix.
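   
   To illustrate the equivalence (a minimal NumPy sketch, not the converter code itself; the shapes and variable names are only for this example):
   
   ```python
   import numpy as np
   
   # matmul of a 2-dim matrix with a 3-dim tensor, rewritten as a single
   # 2-dim matmul plus reshape/transpose instead of a batch matmul.
   m, k, b, n = 4, 5, 3, 6
   A = np.random.rand(m, k)        # 2-dim operand
   B = np.random.rand(b, k, n)     # 3-dim operand
   
   reference = np.matmul(A, B)     # batched result, shape (b, m, n)
   
   B2 = B.transpose(0, 2, 1).reshape(b * n, k)            # collapse B to 2-dim
   out = (A @ B2.T).reshape(m, b, n).transpose(1, 0, 2)   # single 2-dim matmul
   
   assert np.allclose(reference, out)
   ```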
   
   cc @jcf94 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [tvm] masahi merged pull request #7845: Fix PyTorch matmul conversion when given (2-dim, N-dim) input pair

Posted by GitBox <gi...@apache.org>.
masahi merged pull request #7845:
URL: https://github.com/apache/tvm/pull/7845


   





[GitHub] [tvm] masahi commented on pull request #7845: Fix PyTorch matmul conversion when given (2-dim, N-dim) input pair

Posted by GitBox <gi...@apache.org>.
masahi commented on pull request #7845:
URL: https://github.com/apache/tvm/pull/7845#issuecomment-820080193


   Thanks @yuchaoli @jcf94 @comaniac @wweic 





[GitHub] [tvm] jcf94 commented on a change in pull request #7845: Fix PyTorch matmul conversion when given (2-dim, N-dim) input pair

Posted by GitBox <gi...@apache.org>.
jcf94 commented on a change in pull request #7845:
URL: https://github.com/apache/tvm/pull/7845#discussion_r612988079



##########
File path: tests/python/frontend/pytorch/test_forward.py
##########
@@ -162,7 +162,7 @@ def measure_latency(model, input_shapes, output_shapes, thresh, dryruns=40):
                 return est
 
 
-def verify_model(model_name, input_data=[], custom_convert_map={}, rtol=1e-5, atol=1e-5):
+def verify_model(model_name, input_data=[], custom_convert_map={}, rtol=1e-5, atol=1e-5, expected_ops=[]):

Review comment:
       Good addition!

##########
File path: python/tvm/relay/frontend/pytorch.py
##########
@@ -1606,18 +1606,30 @@ def matmul(self, inputs, input_types):
             if need_reshape_output:
                 return _op.reshape(output, [*a_shape[:-2], a_shape[-2], b_shape[-1]])
             return output
+        elif len(a_shape) > 2:
+            inputs_0 = _op.reshape(inputs_0, [-1, a_shape[-1]])
 
-        # Otherwise a simple dense op will get the job done.
-        if len(b_shape) == 1:
+        if len(b_shape) > 2:
+            trans_axes = list(range(len(b_shape)))
+            trans_axes[-2], trans_axes[-1] = trans_axes[-1], trans_axes[-2]
+            input_1 = _op.reshape(_op.transpose(inputs_1, trans_axes), [-1, b_shape[-2]])
+        elif len(b_shape) == 1:
             input_1 = _op.expand_dims(inputs_1, 0, 1)
-        else:
+        elif len(b_shape) == 2:
             input_1 = _op.transpose(inputs_1, axes=(1, 0))
 
         out = _op.nn.dense(inputs_0, input_1)
 
         if len(b_shape) == 1:
             out = _op.squeeze(out, axis=[-1])
 
+        # Reshape output into a N dimensional tensor when a or b dim > 2
+        if len(a_shape) > 2:
+            out = _op.reshape(out, [*a_shape[:-1], b_shape[-1]])
+        elif len(b_shape) > 2:
+            out = _op.reshape(out, [a_shape[-2], -1, b_shape[-1]])
+            out = _op.reshape(_op.transpose(out, [1, 0, 2]), [*b_shape[:-2], a_shape[-2], b_shape[-1]])
+

Review comment:
       Based on @comaniac's comments, I think it would be better to also merge these conditions together:
   
   if len(a_shape) > 2:
   elif len(b_shape) > 2:
   elif len(b_shape) == 1:

##########
File path: python/tvm/relay/frontend/pytorch.py
##########
@@ -1606,18 +1606,30 @@ def matmul(self, inputs, input_types):
             if need_reshape_output:
                 return _op.reshape(output, [*a_shape[:-2], a_shape[-2], b_shape[-1]])
             return output
+        elif len(a_shape) > 2:
+            inputs_0 = _op.reshape(inputs_0, [-1, a_shape[-1]])
 
-        # Otherwise a simple dense op will get the job done.
-        if len(b_shape) == 1:
+        if len(b_shape) > 2:
+            trans_axes = list(range(len(b_shape)))
+            trans_axes[-2], trans_axes[-1] = trans_axes[-1], trans_axes[-2]
+            input_1 = _op.reshape(_op.transpose(inputs_1, trans_axes), [-1, b_shape[-2]])
+        elif len(b_shape) == 1:
             input_1 = _op.expand_dims(inputs_1, 0, 1)
-        else:
+        elif len(b_shape) == 2:
             input_1 = _op.transpose(inputs_1, axes=(1, 0))

Review comment:
       Not necessary, but it would be better if these branches were reordered to:
   
   if len(b_shape) > 2:
   elif len(b_shape) == 2:
   elif len(b_shape) == 1:







[GitHub] [tvm] comaniac commented on a change in pull request #7845: Fix PyTorch matmul conversion when given (2-dim, N-dim) input pair

Posted by GitBox <gi...@apache.org>.
comaniac commented on a change in pull request #7845:
URL: https://github.com/apache/tvm/pull/7845#discussion_r612991959



##########
File path: tests/python/frontend/pytorch/test_forward.py
##########
@@ -219,6 +219,21 @@ def verify_model(model_name, input_data=[], custom_convert_map={}, rtol=1e-5, at
 
                 assert_shapes_match(baseline_output, compiled_output)
                 tvm.testing.assert_allclose(baseline_output, compiled_output, rtol=rtol, atol=atol)
+
+    if len(expected_ops) != 0:

Review comment:
       ```suggestion
       if expected_ops:
   ```
   

##########
File path: tests/python/frontend/pytorch/test_forward.py
##########
@@ -219,6 +219,21 @@ def verify_model(model_name, input_data=[], custom_convert_map={}, rtol=1e-5, at
 
                 assert_shapes_match(baseline_output, compiled_output)
                 tvm.testing.assert_allclose(baseline_output, compiled_output, rtol=rtol, atol=atol)
+
+    if len(expected_ops) != 0:
+        found_op = dict.fromkeys(expected_ops, False)
+        def visit(op):
+            if isinstance(op, tvm.ir.op.Op):
+                if op.name in expected_ops:
+                    found_op[op.name] = True
+                
+        tvm.relay.analysis.post_order_visit(mod['main'].body, visit)
+
+        for op_name, is_found in enumerate(found_op):
+            if not is_found:
+                msg = "TVM Relay do not contain expected op [{}]"
+                raise AssertionError(msg.format(op_name))

Review comment:
       It would be better to collect all the ops that were not found and raise them together.
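   
   For example (an illustrative sketch only, assuming the `found_op` dict built in the diff above; not necessarily the code that was committed):
   
   ```python
   # Gather every expected op that was not found, then raise a single error.
   missing = [name for name, found in found_op.items() if not found]
   if missing:
       msg = "TVM Relay does not contain expected op(s): {}"
       raise AssertionError(msg.format(", ".join(missing)))
   ```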







[GitHub] [tvm] comaniac commented on a change in pull request #7845: Fix PyTorch matmul conversion when given (2-dim, N-dim) input pair

Posted by GitBox <gi...@apache.org>.
comaniac commented on a change in pull request #7845:
URL: https://github.com/apache/tvm/pull/7845#discussion_r612916935



##########
File path: python/tvm/relay/frontend/pytorch.py
##########
@@ -1607,6 +1607,12 @@ def matmul(self, inputs, input_types):
                 return _op.reshape(output, [*a_shape[:-2], a_shape[-2], b_shape[-1]])
             return output
 
+        # Reshape a or b into a 2 dimensional tensor
+        if len(a_shape) > 2:
+            inputs_0 = _op.reshape(inputs_0, [-1, a_shape[-1]])
+        if len(b_shape) > 2:
+            inputs_1 = _op.reshape(inputs_1, [-1, b_shape[-1]])
+
         # Otherwise a simple dense op will get the job done.

Review comment:
       ```suggestion
            # A simple dense op will get the job done.
   ```
   

##########
File path: python/tvm/relay/frontend/pytorch.py
##########
@@ -1607,6 +1607,12 @@ def matmul(self, inputs, input_types):
                 return _op.reshape(output, [*a_shape[:-2], a_shape[-2], b_shape[-1]])
             return output
 
+        # Reshape a or b into a 2 dimensional tensor

Review comment:
       I feel merging them into the above if-branch could be clearer:
   
   ```
   if both a and b > 2:
     Batch matmul
   elif a > 2:
     Reshape a
   elif b > 2:
     Reshape b
   ```
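   
   As a rough sketch of what that merged branching might look like (illustrative only; it reuses the variable names from the diff above and is not necessarily the code that was finally merged):
   
   ```python
   if len(a_shape) > 2 and len(b_shape) > 2:
       # Both operands carry batch dims: keep using the batch_matmul path above.
       ...
   elif len(a_shape) > 2:
       # Flatten the leading dims of a so a plain dense op can be used.
       inputs_0 = _op.reshape(inputs_0, [-1, a_shape[-1]])
   elif len(b_shape) > 2:
       # Flatten the leading dims of b the same way.
       inputs_1 = _op.reshape(inputs_1, [-1, b_shape[-1]])
   ```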







[GitHub] [tvm] yuchaoli commented on a change in pull request #7845: Fix PyTorch matmul conversion when given (2-dim, N-dim) input pair

Posted by GitBox <gi...@apache.org>.
yuchaoli commented on a change in pull request #7845:
URL: https://github.com/apache/tvm/pull/7845#discussion_r612988061



##########
File path: python/tvm/relay/frontend/pytorch.py
##########
@@ -1607,6 +1607,12 @@ def matmul(self, inputs, input_types):
                 return _op.reshape(output, [*a_shape[:-2], a_shape[-2], b_shape[-1]])
             return output
 
+        # Reshape a or b into a 2 dimensional tensor

Review comment:
       Done. To avoid generating redundant reshape/transpose ops, I handle the dimensionality of b independently.







[GitHub] [tvm] comaniac commented on pull request #7845: Fix PyTorch matmul conversion when given (2-dim, N-dim) input pair

Posted by GitBox <gi...@apache.org>.
comaniac commented on pull request #7845:
URL: https://github.com/apache/tvm/pull/7845#issuecomment-819756152


   @wweic it seems there is no conflict with the main branch? Please feel free to merge it.

