Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2022/03/28 12:23:10 UTC

[GitHub] [incubator-mxnet] bgawrych commented on a change in pull request #20952: Restore quantized RNN to master

bgawrych commented on a change in pull request #20952:
URL: https://github.com/apache/incubator-mxnet/pull/20952#discussion_r836368834



##########
File path: src/operator/nn/dnnl/dnnl_rnn.cc
##########
@@ -449,11 +482,19 @@ void DNNLRnnForward::SetNewDataMem(void* x,
   auto& cpu_engine      = CpuEngine::Get()->get_engine();
   dnnl_args_map_t& args = net_args_;
 
+  int src_dtype = dtype;
+  int dst_dtype = dtype;
+  if (param_.quantized) {
+    src_dtype = mshadow::kUint8;

Review comment:
       Now it does - it seems int8 has been supported since oneDNN 2.3 (https://oneapi-src.github.io/oneDNN/v2.2/dev_guide_rnn.html / https://oneapi-src.github.io/oneDNN/v2.3/dev_guide_rnn.html)
   
   A new task can be opened for an int8 implementation and for checking whether it is worthwhile




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@mxnet.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org