Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2021/10/17 00:40:45 UTC

[GitHub] [tvm] jiangjiajun opened a new pull request #9295: [Frontend][PaddlePaddle] Add autopad for conv/pool

jiangjiajun opened a new pull request #9295:
URL: https://github.com/apache/tvm/pull/9295


   This PR also includes `pad3d` and `squeeze`; these two operators will be used for padding tensors. 
   The `_autopad` function is based on the ONNX frontend @mbrookhart 
   
   Thanks for contributing to TVM! Please refer to the guideline https://tvm.apache.org/docs/contribute/ for useful information and tips. After the pull request is submitted, please request code reviews from [Reviewers](https://github.com/apache/incubator-tvm/blob/master/CONTRIBUTORS.md#reviewers) by @-mentioning them in the pull request thread.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@tvm.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [tvm] jiangjiajun commented on a change in pull request #9295: [Frontend][PaddlePaddle] Add autopad for conv/pool

Posted by GitBox <gi...@apache.org>.
jiangjiajun commented on a change in pull request #9295:
URL: https://github.com/apache/tvm/pull/9295#discussion_r734305439



##########
File path: python/tvm/relay/frontend/paddlepaddle.py
##########
@@ -248,24 +287,13 @@ def convert_conv2d(g, op, block):
     if padding_algorithm == "VALID":
         paddings = [0, 0]
     elif padding_algorithm == "SAME":
-        if strides[0] == 1 and strides[1] == 1:
-            pad_h = _get_pad_size(0, (k_h - 1) * dilations[0] + 1, strides[0])
-            pad_w = _get_pad_size(0, (k_w - 1) * dilations[1] + 1, strides[1])
-        else:
-            input_shape = shape_of(input_x)
-            h_w = _op.strided_slice(input_shape, [2], [4])
-            try:
-                in_h, in_w = infer_value(h_w, g.get_params()).numpy().tolist()
-            except Exception as e:
-                msg = "Dynamic shape is not supported in SAME padding algorithm while stride!=1"
-                raise tvm.error.OpAttributeInvalid(msg) from e
-            pad_h = _get_pad_size(in_h, (k_h - 1) * dilations[0] + 1, strides[0])
-            pad_w = _get_pad_size(in_w, (k_w - 1) * dilations[1] + 1, strides[1])
-        paddings = [pad_h[0], pad_w[0], pad_h[1], pad_w[1]]
+        dilations = [1, 1]

Review comment:
       > do we mean to override the dilations?
   
   Yes, after `autopad` is applied to the input tensor, dilation is no longer needed for the next step
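
   The arithmetic behind the removed `_get_pad_size` branch can be sketched in plain Python. This is an illustrative re-derivation of the SAME-padding formula only; the function name `same_pad` is made up and this is not the actual TVM helper:

   ```python
   import math

   def same_pad(in_size, kernel, stride, dilation=1):
       """Compute (pad_before, pad_after) for SAME padding on one spatial axis."""
       # Extent of the kernel after dilation: (k - 1) * d + 1
       effective_k = (kernel - 1) * dilation + 1
       # SAME padding targets an output size of ceil(in / stride)
       out_size = math.ceil(in_size / stride)
       pad_total = max((out_size - 1) * stride + effective_k - in_size, 0)
       # Split the total, putting the extra unit (if any) at the end
       return pad_total // 2, pad_total - pad_total // 2

   # With stride == 1, in_size cancels out of pad_total, which is why the
   # removed code could pass in_size == 0 and skip shape inference entirely:
   print(same_pad(0, 3, 1, 2))   # (2, 2)
   print(same_pad(32, 3, 1, 2))  # (2, 2) -- same result for any input size
   print(same_pad(10, 3, 2))     # (0, 1) -- stride > 1 needs the real size
   ```

   This also shows why the stride != 1 path needed `infer_value`: the padding genuinely depends on the input extent there, so a dynamic shape could not be resolved at conversion time.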







[GitHub] [tvm] jiangjiajun commented on pull request #9295: [Frontend][PaddlePaddle] Add autopad for conv/pool

Posted by GitBox <gi...@apache.org>.
jiangjiajun commented on pull request #9295:
URL: https://github.com/apache/tvm/pull/9295#issuecomment-947264592


   @areusch @comaniac @AndrewZhaoLuo @mbrookhart Hi, could you help review this pull request? All the tests have passed.








[GitHub] [tvm] AndrewZhaoLuo commented on pull request #9295: [Frontend][PaddlePaddle] Add autopad for conv/pool

Posted by GitBox <gi...@apache.org>.
AndrewZhaoLuo commented on pull request #9295:
URL: https://github.com/apache/tvm/pull/9295#issuecomment-948054315


   I'll take a look today.





[GitHub] [tvm] masahi merged pull request #9295: [Frontend][PaddlePaddle] Add autopad for conv/pool

Posted by GitBox <gi...@apache.org>.
masahi merged pull request #9295:
URL: https://github.com/apache/tvm/pull/9295


   





[GitHub] [tvm] jiangjiajun commented on a change in pull request #9295: [Frontend][PaddlePaddle] Add autopad for conv/pool

Posted by GitBox <gi...@apache.org>.
jiangjiajun commented on a change in pull request #9295:
URL: https://github.com/apache/tvm/pull/9295#discussion_r734444151



##########
File path: python/tvm/relay/frontend/paddlepaddle.py
##########
@@ -248,24 +287,13 @@ def convert_conv2d(g, op, block):
     if padding_algorithm == "VALID":
         paddings = [0, 0]
     elif padding_algorithm == "SAME":
-        if strides[0] == 1 and strides[1] == 1:
-            pad_h = _get_pad_size(0, (k_h - 1) * dilations[0] + 1, strides[0])
-            pad_w = _get_pad_size(0, (k_w - 1) * dilations[1] + 1, strides[1])
-        else:
-            input_shape = shape_of(input_x)
-            h_w = _op.strided_slice(input_shape, [2], [4])
-            try:
-                in_h, in_w = infer_value(h_w, g.get_params()).numpy().tolist()
-            except Exception as e:
-                msg = "Dynamic shape is not supported in SAME padding algorithm while stride!=1"
-                raise tvm.error.OpAttributeInvalid(msg) from e
-            pad_h = _get_pad_size(in_h, (k_h - 1) * dilations[0] + 1, strides[0])
-            pad_w = _get_pad_size(in_w, (k_w - 1) * dilations[1] + 1, strides[1])
-        paddings = [pad_h[0], pad_w[0], pad_h[1], pad_w[1]]
+        dilations = [1, 1]

Review comment:
       This is a historical issue in the Paddle framework: when padding == `SAME`, it forces dilations = 1.
   Here is the implementation code https://github.com/PaddlePaddle/Paddle/blob/develop/paddle/fluid/operators/conv_op.h#L113
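
   A minimal sketch of that behavior (illustrative only; the name `same_pads_paddle` is invented, and the real logic lives in Paddle's conv_op.h linked above):

   ```python
   def same_pads_paddle(in_size, kernel, stride, dilation):
       """Return ((pad_before, pad_after), dilation) for one spatial axis."""
       # Paddle's SAME algorithm resets dilation to 1 before computing pads,
       # so the effective kernel extent is just `kernel`.
       dilation = 1
       out_size = -(-in_size // stride)  # ceil division
       pad_total = max((out_size - 1) * stride + kernel - in_size, 0)
       return (pad_total // 2, pad_total - pad_total // 2), dilation

   # Whatever dilation the model asked for, SAME behaves as if it were 1:
   print(same_pads_paddle(10, 3, 2, dilation=5))  # ((0, 1), 1)
   ```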
   







[GitHub] [tvm] AndrewZhaoLuo commented on a change in pull request #9295: [Frontend][PaddlePaddle] Add autopad for conv/pool

Posted by GitBox <gi...@apache.org>.
AndrewZhaoLuo commented on a change in pull request #9295:
URL: https://github.com/apache/tvm/pull/9295#discussion_r734267270



##########
File path: python/tvm/relay/frontend/paddlepaddle.py
##########
@@ -43,18 +44,56 @@
 __all__ = ["from_paddle"]
 
 
-def _get_pad_size(in_size, dilated_kernel_size, stride_size):
-    """Calculate the paddings size for Conv/Pool in SAME padding mode."""
+def _autopad(

Review comment:
       Can you just use https://github.com/apache/tvm/blob/d0c6ca5cacae8dcae26e26287d6d2a270ab6127c/python/tvm/relay/frontend/onnx.py#L412-L472? 
   
   1. Refactor `_autopad` from the onnx.py file into `tvm/python/tvm/relay/frontend/common.py`
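
   For reference, the frontend-agnostic core of such a shared helper is small. A hedged sketch (the function name is illustrative and it handles static shapes only; the real `_autopad` being discussed also handles dynamic shapes via Relay ops):

   ```python
   import math

   def same_pads_all_axes(in_sizes, kernels, strides, dilations):
       """Per-axis (pad_before, pad_after) pairs for SAME padding."""
       pads = []
       for i, k, s, d in zip(in_sizes, kernels, strides, dilations):
           effective_k = (k - 1) * d + 1  # dilated kernel extent
           total = max((math.ceil(i / s) - 1) * s + effective_k - i, 0)
           pads.append((total // 2, total - total // 2))
       return pads

   print(same_pads_all_axes([10, 20], [3, 5], [2, 2], [1, 1]))  # [(0, 1), (1, 2)]
   ```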

##########
File path: python/tvm/relay/frontend/paddlepaddle.py
##########
@@ -248,24 +287,13 @@ def convert_conv2d(g, op, block):
     if padding_algorithm == "VALID":
         paddings = [0, 0]
     elif padding_algorithm == "SAME":
-        if strides[0] == 1 and strides[1] == 1:
-            pad_h = _get_pad_size(0, (k_h - 1) * dilations[0] + 1, strides[0])
-            pad_w = _get_pad_size(0, (k_w - 1) * dilations[1] + 1, strides[1])
-        else:
-            input_shape = shape_of(input_x)
-            h_w = _op.strided_slice(input_shape, [2], [4])
-            try:
-                in_h, in_w = infer_value(h_w, g.get_params()).numpy().tolist()
-            except Exception as e:
-                msg = "Dynamic shape is not supported in SAME padding algorithm while stride!=1"
-                raise tvm.error.OpAttributeInvalid(msg) from e
-            pad_h = _get_pad_size(in_h, (k_h - 1) * dilations[0] + 1, strides[0])
-            pad_w = _get_pad_size(in_w, (k_w - 1) * dilations[1] + 1, strides[1])
-        paddings = [pad_h[0], pad_w[0], pad_h[1], pad_w[1]]
+        dilations = [1, 1]

Review comment:
       Do we mean to override the dilations?








[GitHub] [tvm] jiangjiajun commented on a change in pull request #9295: [Frontend][PaddlePaddle] Add autopad for conv/pool

Posted by GitBox <gi...@apache.org>.
jiangjiajun commented on a change in pull request #9295:
URL: https://github.com/apache/tvm/pull/9295#discussion_r734322553



##########
File path: python/tvm/relay/frontend/paddlepaddle.py
##########
@@ -248,24 +287,13 @@ def convert_conv2d(g, op, block):
     if padding_algorithm == "VALID":
         paddings = [0, 0]
     elif padding_algorithm == "SAME":
-        if strides[0] == 1 and strides[1] == 1:
-            pad_h = _get_pad_size(0, (k_h - 1) * dilations[0] + 1, strides[0])
-            pad_w = _get_pad_size(0, (k_w - 1) * dilations[1] + 1, strides[1])
-        else:
-            input_shape = shape_of(input_x)
-            h_w = _op.strided_slice(input_shape, [2], [4])
-            try:
-                in_h, in_w = infer_value(h_w, g.get_params()).numpy().tolist()
-            except Exception as e:
-                msg = "Dynamic shape is not supported in SAME padding algorithm while stride!=1"
-                raise tvm.error.OpAttributeInvalid(msg) from e
-            pad_h = _get_pad_size(in_h, (k_h - 1) * dilations[0] + 1, strides[0])
-            pad_w = _get_pad_size(in_w, (k_w - 1) * dilations[1] + 1, strides[1])
-        paddings = [pad_h[0], pad_w[0], pad_h[1], pad_w[1]]
+        dilations = [1, 1]

Review comment:
       This is a little different from ONNX; here is the API reference: https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/nn/Conv2D_cn.html#conv2d
   When `padding == SAME`, dilation is no longer needed:
   ![image](https://user-images.githubusercontent.com/19339784/138417147-a815e32b-0aec-4b9f-bcfd-cedeaa2d1959.png)
   







[GitHub] [tvm] jiangjiajun commented on a change in pull request #9295: [Frontend][PaddlePaddle] Add autopad for conv/pool

Posted by GitBox <gi...@apache.org>.
jiangjiajun commented on a change in pull request #9295:
URL: https://github.com/apache/tvm/pull/9295#discussion_r734451096



##########
File path: python/tvm/relay/frontend/paddlepaddle.py
##########
@@ -43,18 +44,56 @@
 __all__ = ["from_paddle"]
 
 
-def _get_pad_size(in_size, dilated_kernel_size, stride_size):
-    """Calculate the paddings size for Conv/Pool in SAME padding mode."""
+def _autopad(

Review comment:
       Good advice. I think this function also works for the TensorFlow and TFLite frontends to solve the dynamic shape problem. 
   
   `shape_of` and `autopad` have both been moved to `common.py`







[GitHub] [tvm] jiangjiajun commented on a change in pull request #9295: [Frontend][PaddlePaddle] Add autopad for conv/pool

Posted by GitBox <gi...@apache.org>.
jiangjiajun commented on a change in pull request #9295:
URL: https://github.com/apache/tvm/pull/9295#discussion_r734452629



##########
File path: python/tvm/relay/frontend/paddlepaddle.py
##########
@@ -248,24 +287,13 @@ def convert_conv2d(g, op, block):
     if padding_algorithm == "VALID":
         paddings = [0, 0]
     elif padding_algorithm == "SAME":
-        if strides[0] == 1 and strides[1] == 1:
-            pad_h = _get_pad_size(0, (k_h - 1) * dilations[0] + 1, strides[0])
-            pad_w = _get_pad_size(0, (k_w - 1) * dilations[1] + 1, strides[1])
-        else:
-            input_shape = shape_of(input_x)
-            h_w = _op.strided_slice(input_shape, [2], [4])
-            try:
-                in_h, in_w = infer_value(h_w, g.get_params()).numpy().tolist()
-            except Exception as e:
-                msg = "Dynamic shape is not supported in SAME padding algorithm while stride!=1"
-                raise tvm.error.OpAttributeInvalid(msg) from e
-            pad_h = _get_pad_size(in_h, (k_h - 1) * dilations[0] + 1, strides[0])
-            pad_w = _get_pad_size(in_w, (k_w - 1) * dilations[1] + 1, strides[1])
-        paddings = [pad_h[0], pad_w[0], pad_h[1], pad_w[1]]
+        dilations = [1, 1]

Review comment:
       To avoid confusion, I put a comment on this line of code to explain it.






