Posted to discuss-archive@tvm.apache.org by Angus He Zz via TVM Discuss <no...@discuss.tvm.ai> on 2020/06/22 11:09:41 UTC

[TVM Discuss] [Questions] Why doesn't nn.layer_norm have TOpPattern?


I want to disable LayerNormToInferUnpack for the layer_norm operator, but when I disable it, a TVMError occurs:

    File "/home/zhongzheng.he/Project/stc-tvm/include/tvm/relay/op.h", line 534
    TVMError: Check failed: idx < data_.size() && data_[idx].second != 0: Attribute TOpPattern has not been registered for Operator nn.layer_norm

When I add a TOpPattern for the nn.layer_norm operator, the error no longer occurs.

This is my patch:

    --- a/src/relay/op/nn/nn.cc
    +++ b/src/relay/op/nn/nn.cc
    @@ -870,6 +870,7 @@ RELAY_REGISTER_OP("nn.layer_norm")
     .add_argument("gamma", "Tensor", "The gamma scale factor.")
     .add_argument("beta", "Tensor", "The beta offset factor.")
     .set_support_level(1)
    +.set_attr<TOpPattern>("TOpPattern", kOpaque)
     .add_type_rel("LayerNorm", LayerNormRel);
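
(For anyone who hits the same error later: I think the same attribute can also be attached from the Python side instead of patching nn.cc. A rough sketch, assuming tvm.relay.op.register_pattern and OpPattern work for nn.layer_norm the same way they do for the ops registered in python/tvm/relay/op/nn/_nn.py:)

    # Hedged sketch: set the TOpPattern attribute for nn.layer_norm from Python,
    # as an alternative to editing src/relay/op/nn/nn.cc. Assumes register_pattern
    # and OpPattern are exposed in tvm.relay.op as they are for the _nn.py ops.
    from tvm.relay import op as reg

    # OpPattern.OPAQUE corresponds to kOpaque on the C++ side: never fuse this op.
    reg.register_pattern("nn.layer_norm", reg.OpPattern.OPAQUE)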


So, why doesn't nn.layer_norm have a TOpPattern?





---
[Visit Topic](https://discuss.tvm.ai/t/why-doesnt-nn-layer-norm-have-toppattern/7046/1) to respond.


[TVM Discuss] [Questions] Why doesn't nn.layer_norm have TOpPattern?

Posted by Angus He Zz via TVM Discuss <no...@discuss.tvm.ai>.

[quote="angusHeZZ, post:1, topic:7046"]
Unpack
[/quote]

I just do not want to unpack the layer_norm operator.





---
[Visit Topic](https://discuss.tvm.ai/t/why-doesnt-nn-layer-norm-have-toppattern/7046/3) to respond.


[TVM Discuss] [Questions] Why doesn't nn.layer_norm have TOpPattern?

Posted by Thomas V via TVM Discuss <no...@discuss.tvm.ai>.

I think the reason is that you typically want to split the op into its statistics-gathering and elementwise operations so that the parts can be fused with the surrounding ops, and keeping it as a single op prevents that. That said, I don't think anyone stops you from changing that; it's just that the other case (splitting the op) is so much more common that no one thought of it.
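
If you want to see what that split looks like, here is a small sketch (the shapes are just made up for illustration) that runs the SimplifyInference pass on a lone nn.layer_norm; it should print the unpacked mean/variance/elementwise expression that FuseOps can then fuse with neighbouring ops:

    import tvm
    from tvm import relay

    # Made-up shapes, purely to illustrate what LayerNormToInferUnpack produces.
    data = relay.var("data", shape=(2, 4, 8), dtype="float32")
    gamma = relay.var("gamma", shape=(8,), dtype="float32")
    beta = relay.var("beta", shape=(8,), dtype="float32")
    out = relay.nn.layer_norm(data, gamma, beta, axis=-1)

    mod = tvm.IRModule.from_expr(relay.Function([data, gamma, beta], out))
    mod = relay.transform.InferType()(mod)          # the unpacking needs type info
    mod = relay.transform.SimplifyInference()(mod)  # rewrites nn.layer_norm
    print(mod)  # mean/variance plus elementwise ops instead of a single nn.layer_norm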





---
[Visit Topic](https://discuss.tvm.ai/t/why-doesnt-nn-layer-norm-have-toppattern/7046/2) to respond.
