Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2021/01/01 02:58:15 UTC

[GitHub] [tvm] insop commented on issue #7176: OpNotImplemented: The following operators are not supported for frontend ONNX: Softplus

insop commented on issue #7176:
URL: https://github.com/apache/tvm/issues/7176#issuecomment-753245149


   `Softplus` was added on 12/10/2020 in https://github.com/apache/tvm/pull/7089
   - https://github.com/apache/tvm/blob/c02c9c528f91f9be3967b7d9ef9f1847f533590b/python/tvm/relay/frontend/onnx.py#L2102
   
   @junrushao1994 @jwfromm 
   
   However, I see that `SoftPlus` (note the **P** is in upper case) was already there.
   According to the [ONNX spec](https://github.com/onnx/onnx/blob/master/docs/Operators.md), the operator is `Softplus`, not `SoftPlus`.
   I am not sure we need to keep both (`Softplus` and `SoftPlus`).
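
   For reference, here is a minimal sketch of how one could confirm that the spec-compliant `Softplus` name now imports (assuming a TVM build that includes #7089; the model and tensor names below are arbitrary):

   ```
   # Build a one-node ONNX graph using the spec-compliant "Softplus" name
   # and import it through the TVM ONNX frontend.
   import onnx
   from onnx import helper, TensorProto
   from tvm import relay

   node = helper.make_node("Softplus", inputs=["x"], outputs=["y"])
   graph = helper.make_graph(
       [node],
       "softplus_check",
       inputs=[helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 4])],
       outputs=[helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 4])],
   )
   model = helper.make_model(graph)

   # Should succeed (lowering to log(exp(x) + 1)) rather than raising
   # OpNotImplemented as reported in this issue.
   mod, params = relay.frontend.from_onnx(model, {"x": (1, 4)})
   print(mod)
   ```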
   
   I have a branch that removes `SoftPlus`; let me know if I should create a PR.
   https://github.com/insop/incubator-tvm/commit/1e944644680188f31ada93a7c4ec7de797a1a0e1.patch
   
   ```
   From 1e944644680188f31ada93a7c4ec7de797a1a0e1 Mon Sep 17 00:00:00 2001
   From: Insop Song <x...@y.z>
   Date: Thu, 31 Dec 2020 18:53:33 -0800
   Subject: [PATCH] Remove seemingly invalid SoftPlus
   
    - `Softplus` was added on 12/10/2020 in https://github.com/apache/tvm/pull/7089
    - However, I see that `SoftPlus` (note the P is capitalized) was already there.
    According to the [ONNX spec](https://github.com/onnx/onnx/blob/master/docs/Operators.md), it is `Softplus`, not `SoftPlus`.
   ---
    python/tvm/relay/frontend/onnx.py          | 9 ---------
    tests/python/frontend/onnx/test_forward.py | 1 -
    2 files changed, 10 deletions(-)
   
   diff --git a/python/tvm/relay/frontend/onnx.py b/python/tvm/relay/frontend/onnx.py
   index 6122c81d321..1c544d30971 100644
   --- a/python/tvm/relay/frontend/onnx.py
   +++ b/python/tvm/relay/frontend/onnx.py
   @@ -932,14 +932,6 @@ def _impl_v1(cls, inputs, attr, params):
            return _op.tanh(_expr.const(beta) * inputs[0]) * _expr.const(alpha)
    
    
   -class SoftPlus(OnnxOpConverter):
   -    """Operator converter for SoftPlus."""
   -
   -    @classmethod
   -    def _impl_v1(cls, inputs, attr, params):
   -        return _op.log(_op.exp(inputs[0]) + _expr.const(1.0))
   -
   -
    class Softsign(OnnxOpConverter):
        """Operator converter for Softsign."""
    
   @@ -2661,7 +2653,6 @@ def _get_convert_map(opset):
            "OneHot": OneHot.get_converter(opset),
            # 'Hardmax'
            "Softsign": Softsign.get_converter(opset),
   -        "SoftPlus": SoftPlus.get_converter(opset),
            "Gemm": Gemm.get_converter(opset),
            "MatMul": MatMul.get_converter(opset),
            "Mod": Mod.get_converter(opset),
   diff --git a/tests/python/frontend/onnx/test_forward.py b/tests/python/frontend/onnx/test_forward.py
   index 33dd048896b..3d95a9a83ee 100644
   --- a/tests/python/frontend/onnx/test_forward.py
   +++ b/tests/python/frontend/onnx/test_forward.py
   @@ -1983,7 +1983,6 @@ def verify_single_ops(op, x, out_np, rtol=1e-5, atol=1e-5):
        verify_single_ops("Tanh", x, np.tanh(x))
        verify_single_ops("Sigmoid", x, 1 / (1 + np.exp(-x)))
        verify_single_ops("Softsign", x, x / (1 + np.abs(x)))
   -    verify_single_ops("SoftPlus", x, np.log(1 + np.exp(x)))
    
    
    @tvm.testing.uses_gpu
   ```
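
   For what it's worth, the removed `SoftPlus` converter and the new `Softplus` converter compute the same function, so dropping the duplicate should not change behavior. A quick numpy check (illustrative only):

   ```
   import numpy as np

   x = np.linspace(-3.0, 3.0, 7).astype("float32")

   # Formula used by the removed SoftPlus converter: log(exp(x) + 1)
   removed = np.log(np.exp(x) + 1.0)

   # Softplus as defined by the ONNX spec: ln(exp(x) + 1),
   # written with log1p here for numerical stability.
   spec = np.log1p(np.exp(x))

   np.testing.assert_allclose(removed, spec, rtol=1e-6)
   ```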
   
   

