Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2020/11/04 17:55:54 UTC

[GitHub] [incubator-tvm] comaniac commented on a change in pull request #6831: [TVMC] add cl support in tvmc runner

comaniac commented on a change in pull request #6831:
URL: https://github.com/apache/incubator-tvm/pull/6831#discussion_r517519346



##########
File path: python/tvm/driver/tvmc/runner.py
##########
@@ -361,7 +361,12 @@ def run_module(
 
         # TODO expand to other supported devices, as listed in tvm.rpc.client (@leandron)
         logger.debug("device is %s", device)
-        ctx = session.cpu() if device == "cpu" else session.gpu()
+        if device == "gpu":
+            ctx = session.gpu()
+        elif device == "cl":
+            ctx = session.cl()
+        else:
+            ctx = session.cpu()

Review comment:
       ```suggestion
               assert device == "cpu"
               ctx = session.cpu()
       ```
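
A minimal sketch of an alternative, table-driven dispatch, assuming a `session` object that exposes `cpu()`, `gpu()` and `cl()` context constructors as TVM RPC sessions do; illustration only, not the suggestion above:

```python
# Sketch only: map supported device names to the session's context constructors.
def lookup_context(session, device):
    makers = {"cpu": session.cpu, "gpu": session.gpu, "cl": session.cl}
    if device not in makers:
        raise ValueError("unsupported device: {}".format(device))
    return makers[device]()
```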

##########
File path: tests/python/driver/tvmc/conftest.py
##########
@@ -70,6 +89,19 @@ def tflite_mobilenet_v1_1_quant(tmpdir_factory):
     return model_file
 
 
+@pytest.fixture(scope="session")
+def tflite_mobilenet_v1_0_25_128(tmpdir_factory):

Review comment:
       Same opinion. I don't see a strong reason to add an end-to-end model test. Please be aware that running an e2e network in the unit tests is relatively time-consuming, so we should avoid unnecessary tests as much as possible.
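
A cheaper alternative to an end-to-end model test would be a tiny Relay model built in memory; a minimal sketch using standard Relay APIs, with a hypothetical fixture name:

```python
import pytest
import tvm
from tvm import relay


@pytest.fixture(scope="session")
def tiny_relay_model():
    # Hypothetical lightweight fixture: a single-conv2d Relay model built in
    # memory, avoiding the download and runtime cost of a full MobileNet.
    data = relay.var("data", shape=(1, 3, 8, 8), dtype="float32")
    weight = relay.var("weight", shape=(4, 3, 3, 3), dtype="float32")
    out = relay.nn.conv2d(data, weight, kernel_size=(3, 3), channels=4)
    return tvm.IRModule.from_expr(relay.Function([data, weight], out))
```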

##########
File path: python/tvm/driver/tvmc/compiler.py
##########
@@ -185,11 +185,11 @@ def compile_model(
         with autotvm.apply_history_best(tuning_records):
             with tvm.transform.PassContext(opt_level=3):
                 logger.debug("building relay graph with tuning records")
-                graph_module = relay.build(mod, tvm_target, params=params, target_host=tvm_target)
+                graph_module = relay.build(mod, tvm_target, params=params, target_host=target_host)

Review comment:
       @leandron please be aware that we should include this fix in the backport, but we will not include this PR since it is a new feature. Accordingly, I'd suggest separating this fix into its own PR.
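
For reference, the corrected call passes the host target separately from the device target, which matters when the device target is non-CPU such as "opencl"; a sketch assuming `mod`, `params`, `tuning_records`, `tvm_target` and `target_host` as defined in compiler.py:

```python
import tvm
from tvm import autotvm, relay

# Sketch only: device kernels are compiled for tvm_target (e.g. "opencl"),
# host-side code for target_host (e.g. "llvm"), instead of reusing the
# device target for both.
with autotvm.apply_history_best(tuning_records):
    with tvm.transform.PassContext(opt_level=3):
        graph_module = relay.build(
            mod,
            tvm_target,
            params=params,
            target_host=target_host,
        )
```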




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org