Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2020/03/10 19:28:36 UTC

[GitHub] [incubator-mxnet] ChaiBapchya commented on issue #17805: fixing batch_norm and layer_norm for large tensor nightly test

URL: https://github.com/apache/incubator-mxnet/pull/17805#issuecomment-597270999
 
 
   1. How is the addition of SHAPE_ASSIGN_CHECK to layer_norm causing this failure?
   The layer norm and batch norm tests were passing before, and some change caused them to start failing, right? What's the root cause?
   
   2. Also, it turns out batch norm already has a shape check in test_large_array.py:
   https://github.com/apache/incubator-mxnet/blob/afb8742e6e1e987833b39c487dc892b5537196a1/tests/nightly/test_large_array.py#L327
   
   Layer norm doesn't have such a check in test_large_array.py; it would be worth adding one.
   
   Fundamentally, for both batch norm and layer norm, since the operation just normalizes values over a layer/batch, the input shape should be equal to the output shape.
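   That shape invariant is easy to see from the math: normalization only rescales and shifts each element, so the output always has exactly the input's shape. A minimal NumPy sketch of layer norm illustrating the invariant the nightly test should assert (an illustrative stand-in, not MXNet's actual implementation; `layer_norm` here is a hypothetical helper):

   ```python
   import numpy as np

   def layer_norm(x, gamma, beta, axis=-1, eps=1e-5):
       """Plain-NumPy layer normalization: normalize along `axis`,
       then apply a per-element scale (gamma) and shift (beta)."""
       mean = x.mean(axis=axis, keepdims=True)
       var = x.var(axis=axis, keepdims=True)
       x_hat = (x - mean) / np.sqrt(var + eps)
       return gamma * x_hat + beta

   # The check a large-tensor test would make (with much larger shapes):
   x = np.random.rand(4, 3, 5).astype(np.float32)
   out = layer_norm(x, gamma=np.ones(5, np.float32), beta=np.zeros(5, np.float32))
   assert out.shape == x.shape  # normalization never changes the shape
   ```

   The same assertion pattern as the existing batch norm check (output shape equals input shape) would cover layer norm in test_large_array.py.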

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services