Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2021/05/18 22:16:31 UTC

[GitHub] [tvm] comaniac commented on a change in pull request #8030: [TOPI] Custom schedule for standalone transpose in cuda

comaniac commented on a change in pull request #8030:
URL: https://github.com/apache/tvm/pull/8030#discussion_r634787135



##########
File path: vta/tutorials/autotvm/tune_relay_vta.py
##########
@@ -357,7 +357,7 @@ def tune_and_evaluate(tuning_opt):
     )
 
     # filter out non-packed conv2d task
-    tasks = list(filter(lambda t: len(t.args[0][1]) > 4, tasks))
+    tasks = list(filter(lambda t: len(t.args[0][1]) > 4 and "conv" in t.name, tasks))

Review comment:
       Isn't the newly added schedule non-tunable? Or is there a concern with adding knobs?
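The filter in the diff above keeps only packed conv2d tasks: `t.args[0][1]` is the shape of the task's first input, and the new `"conv" in t.name` check additionally drops non-conv tasks. Its logic can be sketched with stdlib mocks; the `Task` namedtuple and the sample task names here are hypothetical stand-ins for TVM's real tuning-task objects, not the actual API:

```python
from collections import namedtuple

# Hypothetical stand-in for a TVM tuning task: `args[0][1]` plays the role
# of the first input's shape and `name` the task/op name, mirroring the
# fields the real filter inspects.
Task = namedtuple("Task", ["name", "args"])

tasks = [
    Task("conv2d_packed.vta", ((None, (1, 16, 14, 14, 1, 16)),)),  # rank-6 shape, conv: kept
    Task("conv2d_nchw.x86", ((None, (1, 16, 14, 14)),)),           # rank-4 shape: dropped
    Task("dense_packed.vta", ((None, (1, 16, 14, 14, 1, 16)),)),   # rank-6 but not conv
]

# Old filter: keep any task whose first-input shape has rank > 4.
old = [t for t in tasks if len(t.args[0][1]) > 4]

# New filter from the diff: additionally require "conv" in the task name,
# so other packed workloads (e.g. the non-tunable transpose schedule) are
# excluded from tuning.
new = [t for t in tasks if len(t.args[0][1]) > 4 and "conv" in t.name]

print([t.name for t in old])  # ['conv2d_packed.vta', 'dense_packed.vta']
print([t.name for t in new])  # ['conv2d_packed.vta']
```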

##########
File path: tests/python/topi/python/test_topi_transform.py
##########
@@ -870,6 +871,31 @@ def test_transpose():
     verify_transpose((3, 10), None)
 
 
+@tvm.testing.parametrize_targets
+def test_transpose_schedule(target, dev):
+    shape = (100, target.thread_warp_size + 3)
+    x = relay.var("x", relay.TensorType(shape, "float32"))
+    f = relay.transpose(x)
+    ex = relay.create_executor(
+        kind="graph", mod=tvm.IRModule.from_expr(relay.Function([x], f)), device=dev, target=target
+    )
+    r = np.random.rand(*shape)
+    tvm.testing.assert_allclose(ex.evaluate()(r).asnumpy(), np.transpose(r))
+
+    # We want to make sure schedule does not fire here, but there is no way of
+    # inspecting which schedules were used.

Review comment:
       As this comment mentions, there is no way to inspect which schedules were used, so it seems the only difference between this test and `test_transpose` is that this workload includes `add` to cover the fusion case. Accordingly, could we just extend `test_transpose`?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org