Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/07/23 21:47:52 UTC

[GitHub] [incubator-mxnet] larroy commented on a change in pull request #15516: Fix memory leak reported by ASAN in NNVM to ONNX conversion

URL: https://github.com/apache/incubator-mxnet/pull/15516#discussion_r306547998
 
 

 ##########
 File path: src/operator/subgraph/tensorrt/nnvm_to_onnx.cc
 ##########
 @@ -519,21 +519,18 @@ void ConvertConstant(
   auto size = shape.Size();
 
   if (dtype == TensorProto_DataType_FLOAT) {
-    std::shared_ptr<float> shared_data_ptr(new float[size]);
-    float* const data_ptr = shared_data_ptr.get();
-    nd.SyncCopyToCPU(static_cast<void*>(data_ptr), size);
-
-    for (size_t blob_idx = 0; blob_idx < size; ++blob_idx) {
-      initializer_proto->add_float_data(data_ptr[blob_idx]);
-    }
+      std::vector<float> constants(size);
+      nd.SyncCopyToCPU(&constants, size);
 
 Review comment:
   🙅‍♂️ This is not correct. You are taking the address of the vector object itself, but you want the address of its data.
   
   Please use the .data() member or the address of the first element.
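   
   For reference, a minimal sketch of the corrected copy, assuming `nd` and `initializer_proto` are the `NDArray` and ONNX tensor proto from the surrounding diff:
   
   ```cpp
   // Pass the vector's underlying buffer to SyncCopyToCPU, not the vector object.
   std::vector<float> constants(size);
   nd.SyncCopyToCPU(constants.data(), size);  // equivalently: &constants[0]
   
   // Copy the blob into the ONNX initializer, as in the original loop.
   for (const float value : constants) {
     initializer_proto->add_float_data(value);
   }
   ```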

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services