Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2020/04/24 21:18:40 UTC

[GitHub] [incubator-tvm] mbrookhart opened a new pull request #5441: Add TopK to ONNX Frontend

mbrookhart opened a new pull request #5441:
URL: https://github.com/apache/incubator-tvm/pull/5441


   @masahi @jwfromm
   
   Thanks!
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] jwfromm commented on a change in pull request #5441: Add TopK to ONNX Frontend

Posted by GitBox <gi...@apache.org>.
jwfromm commented on a change in pull request #5441:
URL: https://github.com/apache/incubator-tvm/pull/5441#discussion_r414874263



##########
File path: python/tvm/relay/frontend/onnx.py
##########
@@ -1470,6 +1470,23 @@ def _impl_v9(cls, inputs, attr, params):
         output = AttrCvt(op_name='argwhere')(inputs, attr, params)
         return _op.transpose(output, axes=(1, 0))
 
+class TopK(OnnxOpConverter):
+    """Operator converter for TopK
+    """
+    @classmethod
+    def _impl_v1(cls, inputs, attr, params):
+        if len(inputs) != 2:
+            raise ValueError("Expects exactly 2 inputs")
+        inp = inputs[0]
+        axis = attr.get("axis", len(infer_shape(inp)) - 1)
+        largest = attr.get("largest", 1)
+
+        if largest == 0:
+            raise ValueError("TVM only supports finding TopK largest elements")

Review comment:
       maybe assert that `largest > 0` to catch negative cases as well.

##########
File path: python/tvm/relay/frontend/onnx.py
##########
@@ -1470,6 +1470,23 @@ def _impl_v9(cls, inputs, attr, params):
         output = AttrCvt(op_name='argwhere')(inputs, attr, params)
         return _op.transpose(output, axes=(1, 0))
 
+class TopK(OnnxOpConverter):
+    """Operator converter for TopK
+    """
+    @classmethod
+    def _impl_v1(cls, inputs, attr, params):
+        if len(inputs) != 2:
+            raise ValueError("Expects exactly 2 inputs")
+        inp = inputs[0]
+        axis = attr.get("axis", len(infer_shape(inp)) - 1)

Review comment:
       why not use `-1` as axis instead of inferring shape here?
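The suggestion works because tensor libraries resolve a negative axis relative to the rank, so `-1` always names the trailing axis without needing shape inference. A minimal sketch (plain Python, no TVM dependency; `normalize_axis` is a hypothetical helper, not Relay API) showing that the inferred default `len(shape) - 1` and the shorthand `-1` resolve to the same axis:

```python
def normalize_axis(axis, ndim):
    """Map a possibly-negative axis to its non-negative form,
    the way most tensor libraries resolve negative axes."""
    return axis % ndim

# For any rank, the inferred default len(shape) - 1 and the
# shorthand -1 name the same trailing axis after normalization.
for shape in [(10,), (10, 10), (10, 10, 10)]:
    ndim = len(shape)
    assert normalize_axis(ndim - 1, ndim) == normalize_axis(-1, ndim)
```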

##########
File path: tests/python/frontend/onnx/test_forward.py
##########
@@ -2330,6 +2330,44 @@ def verify_nonzero(indata, outdata, dtype):
     result = np.array((np.nonzero(input_data)))  # expected output [[0, 1, 2, 2], [0, 1, 0, 1]]
     verify_nonzero(input_data, result, dtype=np.int64)
 
+def test_topk():
+    def verify_topk(input_dims, K, axis=-1):
+        output_dims = list(input_dims)
+        output_dims[axis] = K
+
+        node = helper.make_node('TopK',
+                                inputs=['X', 'K'],
+                                outputs=['Values', 'Indicies'],
+                                axis=axis)
+
+        graph = helper.make_graph([node],
+                                  "topk_test",
+                                  inputs=[helper.make_tensor_value_info("X", TensorProto.FLOAT, list(input_dims)),
+                                          helper.make_tensor_value_info("K", TensorProto.INT64, [1,])],
+                                  initializer=[helper.make_tensor("K", TensorProto.INT64, [1], [K])],
+                                  outputs=[helper.make_tensor_value_info("Values", TensorProto.FLOAT, output_dims), 
+                                           helper.make_tensor_value_info("Indicies", TensorProto.INT64, output_dims)])
+
+        model = helper.make_model(graph, producer_name='topk_test')
+
+        indata = np.random.uniform(-10, 10, input_dims).astype(np.float32)
+        onnx_out = get_onnxruntime_output(model, [indata, K])
+        print(onnx_out)
+
+        for target, ctx in [('llvm', tvm.cpu())]:
+            tvm_out = get_tvm_output(model, indata, target, ctx, [output_dims, output_dims], 
+                    output_dtype=['float32', 'int64'])
+            print(tvm_out)
+            tvm.testing.assert_allclose(onnx_out, tvm_out, rtol=1e-05, atol=1e-05)
+    
+    for shape in [[10], [10, 10], [10, 10, 10]]:
+        for k in [1, 5, 10]:
+            verify_topk(shape, k)
+
+    verify_topk([10,10,10], 5, 0)
+    verify_topk([10,10,10], 5, 1)
+    verify_topk([10,10,10], 5, 2)

Review comment:
       It's probably worth testing at least one other input dimension.
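When widening the test shapes, a NumPy oracle for ONNX TopK semantics (largest=1, sorted=1: top-k values and indices along an axis, descending) can double-check results on dimensions onnxruntime was not run against. This is a hedged sketch; `ref_topk` is a hypothetical helper, not part of the test suite, and it does not reproduce ONNX's lower-index-first tie-breaking:

```python
import numpy as np

def ref_topk(x, k, axis=-1):
    """NumPy reference for ONNX TopK with largest=1, sorted=1:
    top-k values and their indices along `axis`, descending.
    Note: ties may be ordered differently than ONNX specifies."""
    idx = np.argsort(x, axis=axis)          # ascending order
    idx = np.flip(idx, axis=axis)           # make it descending
    idx = np.take(idx, range(k), axis=axis) # keep the first k entries
    return np.take_along_axis(x, idx, axis=axis), idx

x = np.array([[3., 1., 4.], [1., 5., 9.]])
vals, idx = ref_topk(x, 2, axis=1)  # vals: [[4, 3], [9, 5]]
```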







[GitHub] [incubator-tvm] mbrookhart commented on a change in pull request #5441: Add TopK to ONNX Frontend

Posted by GitBox <gi...@apache.org>.
mbrookhart commented on a change in pull request #5441:
URL: https://github.com/apache/incubator-tvm/pull/5441#discussion_r414914928



##########
File path: python/tvm/relay/frontend/onnx.py
##########
@@ -1470,6 +1470,23 @@ def _impl_v9(cls, inputs, attr, params):
         output = AttrCvt(op_name='argwhere')(inputs, attr, params)
         return _op.transpose(output, axes=(1, 0))
 
+class TopK(OnnxOpConverter):
+    """Operator converter for TopK
+    """
+    @classmethod
+    def _impl_v1(cls, inputs, attr, params):
+        if len(inputs) != 2:
+            raise ValueError("Expects exactly 2 inputs")
+        inp = inputs[0]
+        axis = attr.get("axis", len(infer_shape(inp)) - 1)
+        largest = attr.get("largest", 1)
+
+        if largest == 0:
+            raise ValueError("TVM only supports finding TopK largest elements")

Review comment:
       It's a boolean value stored as an int. Anything but zero is technically true, and I only want to throw on the false case.
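The two validation styles under discussion differ only on negative inputs. A minimal sketch (plain Python; the function names are illustrative, not the PR's code) contrasting the equality check with the suggested assert:

```python
def check_largest_eq(largest):
    """Author's style: reject only the explicit false value 0."""
    if largest == 0:
        raise ValueError("TVM only supports finding TopK largest elements")
    return True

def check_largest_gt(largest):
    """Reviewer's style: also rejects negative values."""
    assert largest > 0, "TVM only supports finding TopK largest elements"
    return True

# ONNX stores the flag as an int; any nonzero value is truthy,
# so -1 passes the equality check but trips the assert.
assert check_largest_eq(-1)
```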







[GitHub] [incubator-tvm] masahi commented on pull request #5441: Add TopK to ONNX Frontend

Posted by GitBox <gi...@apache.org>.
masahi commented on pull request #5441:
URL: https://github.com/apache/incubator-tvm/pull/5441#issuecomment-619330728


   Thanks @mbrookhart @jwfromm 
   





[GitHub] [incubator-tvm] mbrookhart commented on pull request #5441: Add TopK to ONNX Frontend

Posted by GitBox <gi...@apache.org>.
mbrookhart commented on pull request #5441:
URL: https://github.com/apache/incubator-tvm/pull/5441#issuecomment-619418119


   Thank you!

