Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2022/06/24 14:07:54 UTC

[GitHub] [incubator-mxnet] bartekkuncer commented on a diff in pull request #20976: [FEATURE] Add _npi_power_scalar and _npi_multiply_scalar fuse

bartekkuncer commented on code in PR #20976:
URL: https://github.com/apache/incubator-mxnet/pull/20976#discussion_r906102338


##########
src/operator/nn/dnnl/dnnl_pow_mul_scalar.cc:
##########
@@ -18,49 +18,59 @@
  */
 
 /*!
- * \file dnnl_power_scalar.cc
- * \author: Adam Grabowski, adam.grabowski@intel.com
+ * \file dnnl_pow_mul_scalar.cc
  */
 
 #if MXNET_USE_ONEDNN == 1
 
-#include "dnnl_power_scalar-inl.h"
+#include "dnnl_pow_mul_scalar-inl.h"
 
 namespace mxnet {
 namespace op {
 
-DNNLPowerFwd& DNNLPowerFwd::GetPowerForward(const nnvm::NodeAttrs& attrs,
-                                            const NDArray& input,
-                                            const NDArray& output) {
-  const NumpyBinaryScalarParam& param = nnvm::get<NumpyBinaryScalarParam>(attrs.parsed);
+bool SupportDNNLPower(const NDArray& input) {
+  return input.shape().Size() != 0 && input.shape().ndim() > 0 && input.shape().ndim() <= 6 &&
+         input.dtype() == mshadow::kFloat32;
+}

Review Comment:
   Good catch, thanks! :)
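
   For illustration only, the support check added in the diff can be sketched as a standalone predicate. This is not the actual MXNet code: the real `SupportDNNLPower` inspects an `NDArray`, while this hypothetical `SupportsDNNLPowerSketch` takes the relevant shape properties and dtype directly so it can be compiled and exercised on its own.

   ```cpp
   #include <cassert>
   #include <cstdint>

   // Hypothetical stand-in for MXNet's dtype enum, for this sketch only.
   enum class DType { kFloat32, kFloat16, kInt32 };

   // Mirrors the logic of SupportDNNLPower from the diff: the oneDNN path is
   // taken only for non-empty float32 tensors with between 1 and 6 dimensions.
   bool SupportsDNNLPowerSketch(int64_t num_elements, int ndim, DType dtype) {
     return num_elements != 0 && ndim > 0 && ndim <= 6 && dtype == DType::kFloat32;
   }

   int main() {
     assert(SupportsDNNLPowerSketch(24, 3, DType::kFloat32));   // typical dense tensor
     assert(!SupportsDNNLPowerSketch(0, 3, DType::kFloat32));   // empty tensor rejected
     assert(!SupportsDNNLPowerSketch(24, 7, DType::kFloat32));  // too many dimensions
     assert(!SupportsDNNLPowerSketch(24, 3, DType::kInt32));    // unsupported dtype
     return 0;
   }
   ```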


