Posted to commits@tvm.apache.org by tq...@apache.org on 2020/10/08 12:38:49 UTC

[incubator-tvm] branch master updated: Fix leakyReLU support for CoreML (#6651)

This is an automated email from the ASF dual-hosted git repository.

tqchen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-tvm.git


The following commit(s) were added to refs/heads/master by this push:
     new dffdb23  Fix leakyReLU support for CoreML (#6651)
dffdb23 is described below

commit dffdb23b64f81df758a2fd86107f18d8f63e35bf
Author: Ishi Tatsuyuki <is...@gmail.com>
AuthorDate: Thu Oct 8 21:37:57 2020 +0900

    Fix leakyReLU support for CoreML (#6651)
    
    The original implementation failed with the following error:
    
    File "../include/tvm/runtime/packed_func.h", line 372
    TVMError: Check failed: type_code_ == kDLFloat (8 vs. 2) : expected float but get Object
---
 python/tvm/relay/frontend/coreml.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/python/tvm/relay/frontend/coreml.py b/python/tvm/relay/frontend/coreml.py
index e510d6a..4efe014 100644
--- a/python/tvm/relay/frontend/coreml.py
+++ b/python/tvm/relay/frontend/coreml.py
@@ -138,7 +138,7 @@ def _ActivationParams(op, inexpr, etab):
     if whichActivation == "ReLU":
         return _op.nn.relu(inexpr)
     if whichActivation == "leakyReLU":
-        _op.nn.leaky_relu(inexpr, alpha=_expr.const(par.alpha, dtype="float32"))
+        return _op.nn.leaky_relu(inexpr, alpha=par.alpha)
     elif whichActivation == "thresholdedReLU":
         alpha_tensor = _op.full_like(inexpr, fill_value=_expr.const(par.alpha, dtype="float32"))
         return _op.multiply(inexpr, _op.greater(inexpr, alpha_tensor).as_type("float32"))
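The fix above addresses two issues: the `leakyReLU` branch computed a value but never returned it, and it passed a wrapped constant expression as `alpha` where the operator expects a plain float (hence "expected float but get Object"). As a hypothetical illustration only (plain Python stand-ins, not TVM's actual operators), the corrected dispatch pattern looks like:

```python
# Minimal sketch of the fixed dispatch pattern. leaky_relu here is a
# stand-in for _op.nn.leaky_relu, not the real TVM operator.
def leaky_relu(x, alpha):
    # alpha must be a plain Python float; passing a wrapped constant
    # object here is what triggered the type-code check failure.
    assert isinstance(alpha, float)
    return [v if v > 0 else alpha * v for v in x]

def activation(which, x, alpha=0.1):
    if which == "ReLU":
        return [max(v, 0.0) for v in x]
    if which == "leakyReLU":
        # Before the fix, the call below had no `return`, so the
        # function fell through and callers received None.
        return leaky_relu(x, alpha=alpha)
    raise NotImplementedError(which)
```

With the `return` in place, `activation("leakyReLU", [-1.0, 2.0], 0.1)` yields `[-0.1, 2.0]` instead of `None`.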