Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/10/31 09:58:36 UTC

[GitHub] [incubator-mxnet] zixuanweeei commented on issue #16578: Fused RNN operators do not support `add` grad_req with MKL-DNN

URL: https://github.com/apache/incubator-mxnet/issues/16578#issuecomment-548293860
 
 
   > @TaoLv, I suggest you get this issue verified once you have [Upgrade MKL-DNN dependency to v1.0 #16555] merged.
   
   Thanks for your suggestion. The MKL-DNN RNN operators do have this issue, and they will terminate the program when `add` is requested. I have not come across cases that use `add` for RNN training, so it would be highly appreciated if you could share any you know of.
   Currently, we need to deliver the gradients from MKL-DNN memory space back to MXNet's native space, which requires further design work to guarantee both performance and accuracy. We prefer to have #16555 merged first, and then implement the `add` operation.
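   For readers unfamiliar with `grad_req`: it controls how an operator's backward pass stores gradients into the input gradient buffer. The sketch below is an illustrative NumPy model of that contract (the function name and gradient computation are hypothetical, not MXNet or MKL-DNN internals); the point is that `add` must accumulate into whatever is already in the buffer rather than overwrite it, which is exactly the step the fused RNN backward currently cannot perform from its own memory space.

   ```python
   import numpy as np

   def backward(out_grad, in_grad, grad_req):
       """Store a gradient into in_grad according to grad_req semantics.

       Illustrative only: `2.0 * out_grad` stands in for whatever gradient
       the real operator's backward pass computes.
       """
       computed = 2.0 * out_grad
       if grad_req == "write":
           in_grad[:] = computed      # overwrite previous contents
       elif grad_req == "add":
           in_grad[:] += computed     # accumulate into previous contents
       elif grad_req == "null":
           pass                       # gradient not requested
       else:
           raise ValueError("unsupported grad_req: %s" % grad_req)
       return in_grad

   # With a buffer pre-filled with ones, "add" yields 1 + 2 = 3,
   # while "write" discards the old value and yields 2.
   buf = np.ones(3)
   backward(np.ones(3), buf, "add")
   ```

   Supporting `add` for the fused operators therefore means the gradients computed inside MKL-DNN must be read back and summed into the existing MXNet-side buffers, rather than written out directly.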

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services