Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2022/09/28 23:34:38 UTC

[GitHub] [tvm] AndrewZhaoLuo opened a new pull request, #12935: [Relay][Topi] Hook up LayerNorm to new Topi

AndrewZhaoLuo opened a new pull request, #12935:
URL: https://github.com/apache/tvm/pull/12935

   This makes the layer_norm Relay op dispatch to the new TOPI implementation committed in PR #12864.
   
   Using the TIR common-subexpression-elimination (CSE) pass with the FP16 layer norm also requires handling the FP16 type when packing arguments for CUDA.
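   
   For context, here is a minimal sketch of exercising nn.layer_norm from Relay so that it dispatches through the TOPI path (shapes, dtype, and target below are illustrative assumptions, not taken from the PR):
   
       import numpy as np
       import tvm
       from tvm import relay
       from tvm.contrib import graph_executor
       
       # Hypothetical shapes: normalize over the last axis of an (8, 128) input.
       data = relay.var("data", shape=(8, 128), dtype="float32")
       gamma = relay.var("gamma", shape=(128,), dtype="float32")
       beta = relay.var("beta", shape=(128,), dtype="float32")
       out = relay.nn.layer_norm(data, gamma, beta, axis=-1, epsilon=1e-5)
       mod = tvm.IRModule.from_expr(relay.Function([data, gamma, beta], out))
       
       # opt_level=3 runs the fusion pass, so the TOpPattern registered for
       # nn.layer_norm (discussed below) determines what it fuses with.
       with tvm.transform.PassContext(opt_level=3):
           lib = relay.build(mod, target="llvm")
       
       dev = tvm.cpu()
       rt = graph_executor.GraphModule(lib["default"](dev))
       rt.set_input("data", np.random.rand(8, 128).astype("float32"))
       rt.set_input("gamma", np.ones(128, "float32"))
       rt.set_input("beta", np.zeros(128, "float32"))
       rt.run()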


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@tvm.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [tvm] AndrewZhaoLuo commented on a diff in pull request #12935: [Relay][Topi] Hook up LayerNorm to new Topi

Posted by GitBox <gi...@apache.org>.
AndrewZhaoLuo commented on code in PR #12935:
URL: https://github.com/apache/tvm/pull/12935#discussion_r983885064


##########
src/relay/op/nn/nn.cc:
##########
@@ -1023,7 +1023,8 @@ RELAY_REGISTER_OP("nn.layer_norm")
     .set_attr<FInferCorrectLayout>("FInferCorrectLayout",
                                    NormalizationInferCorrectLayout<LayerNormAttrs>)
     .set_support_level(1)
-    .add_type_rel("LayerNorm", LayerNormRel);
+    .add_type_rel("LayerNorm", LayerNormRel)
+    .set_attr<TOpPattern>("TOpPattern", kInjective);

Review Comment:
   Done



##########
src/relay/op/nn/nn.cc:
##########
@@ -1023,7 +1023,8 @@ RELAY_REGISTER_OP("nn.layer_norm")
     .set_attr<FInferCorrectLayout>("FInferCorrectLayout",
                                    NormalizationInferCorrectLayout<LayerNormAttrs>)
     .set_support_level(1)
-    .add_type_rel("LayerNorm", LayerNormRel);
+    .add_type_rel("LayerNorm", LayerNormRel)
+    .set_attr<TOpPattern>("TOpPattern", kInjective);

Review Comment:
   Yes, I believe so. Done.
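   
   As a quick sanity check after a change like this, the registered pattern can be read back from Python (a small sketch; the integer corresponds to the values in tvm.relay.op.OpPattern):
   
       from tvm import relay
       from tvm.relay.op import OpPattern
       
       # Look up the operator and read back its registered TOpPattern attribute.
       op = relay.op.get("nn.layer_norm")
       pattern = op.get_attr("TOpPattern")
       print(pattern, pattern == OpPattern.OUT_ELEMWISE_FUSABLE)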





[GitHub] [tvm] AndrewZhaoLuo commented on pull request #12935: [Relay][Topi] Hook up LayerNorm to new Topi

Posted by GitBox <gi...@apache.org>.
AndrewZhaoLuo commented on PR #12935:
URL: https://github.com/apache/tvm/pull/12935#issuecomment-1347200737

   @masahi
   
   Ah yes, I will try to get to this next week, when I will have time.
   
   1. Figure out the behavior for AutoTVM and the auto-scheduler.
   2. The TOPI kernel for fused layer norm is known to be numerically unstable; we should first replace it with a more stable formulation, e.g. the parallel algorithm in https://en.wikipedia.org/wiki/Algorithms_for_calculating_variance#Parallel_algorithm (see the sketch after this list).
   3. PR https://github.com/apache/tvm/pull/13532 is included here to support FP16 layer norm, but it needs some refactoring to avoid the casting cost.
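   
   To illustrate the instability in point 2 with a small NumPy sketch (not TVM code): the naive one-pass formula E[x^2] - E[x]^2, the usual source of this kind of instability, cancels catastrophically, while the pairwise/parallel algorithm from the linked article stays accurate.
   
       import numpy as np
       
       def var_naive(x):
           # One-pass E[x^2] - E[x]^2: catastrophic cancellation when the
           # mean is large relative to the standard deviation.
           return np.mean(x * x) - np.mean(x) ** 2
       
       def var_pairwise(x):
           # Chan et al. parallel algorithm: compute (count, mean, M2) per
           # chunk, then merge the partial results.
           def stats(chunk):
               mean = chunk.mean()
               return chunk.size, mean, ((chunk - mean) ** 2).sum()
       
           def merge(a, b):
               na, ma, m2a = a
               nb, mb, m2b = b
               n = na + nb
               delta = mb - ma
               return n, ma + delta * nb / n, m2a + m2b + delta * delta * na * nb / n
       
           half = x.size // 2
           n, _, m2 = merge(stats(x[:half]), stats(x[half:]))
           return m2 / n
       
       # Tiny variance on a huge mean: the worst case for the naive formula.
       x = (np.random.rand(1 << 16) + 1e6).astype("float32")
       print(var_naive(x))                 # wildly wrong, can even be negative
       print(var_pairwise(x))              # close to the float64 reference
       print(np.var(x.astype("float64")))  # reference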




[GitHub] [tvm] AndrewZhaoLuo commented on pull request #12935: [Relay][Topi] Hook up LayerNorm to new Topi

Posted by GitBox <gi...@apache.org>.
AndrewZhaoLuo commented on PR #12935:
URL: https://github.com/apache/tvm/pull/12935#issuecomment-1262646599

   I have removed the unneeded files.
   
   Good point about non-MetaSchedule workflows. I will see what I can do here.




[GitHub] [tvm] vinx13 commented on a diff in pull request #12935: [Relay][Topi] Hook up LayerNorm to new Topi

Posted by GitBox <gi...@apache.org>.
vinx13 commented on code in PR #12935:
URL: https://github.com/apache/tvm/pull/12935#discussion_r983865571


##########
src/relay/op/nn/nn.cc:
##########
@@ -1023,7 +1023,8 @@ RELAY_REGISTER_OP("nn.layer_norm")
     .set_attr<FInferCorrectLayout>("FInferCorrectLayout",
                                    NormalizationInferCorrectLayout<LayerNormAttrs>)
     .set_support_level(1)
-    .add_type_rel("LayerNorm", LayerNormRel);
+    .add_type_rel("LayerNorm", LayerNormRel)
+    .set_attr<TOpPattern>("TOpPattern", kInjective);

Review Comment:
   Should this be kOutEWiseFusable? Since the op contains a reduction, it can't be fused into other injective ops.
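   
   For reference, a sketch of the fusion pattern levels as exposed in Python. FuseOps lets an OUT_ELEMWISE_FUSABLE op (kOutEWiseFusable in C++) absorb elementwise ops at its output, but does not treat it as injective:
   
       from tvm.relay.op import OpPattern
       
       # Fusion pattern levels; lower values fuse more freely.
       print(OpPattern.ELEMWISE)              # 0
       print(OpPattern.BROADCAST)             # 1
       print(OpPattern.INJECTIVE)             # 2
       print(OpPattern.COMM_REDUCE)           # 3
       print(OpPattern.OUT_ELEMWISE_FUSABLE)  # 4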





[GitHub] [tvm] masahi commented on pull request #12935: [Relay][Topi] Hook up LayerNorm to new Topi

Posted by GitBox <gi...@apache.org>.
masahi commented on PR #12935:
URL: https://github.com/apache/tvm/pull/12935#issuecomment-1346364850

   Any update? I'm also interested in using the fused layer norm.

