Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2017/12/30 04:45:40 UTC

[GitHub] kohillyang opened a new issue #9259: Shared memory leak when using mxnet.gluon.data.DataLoader

URL: https://github.com/apache/incubator-mxnet/issues/9259
 
 
   ## Description
   I found that there may be a shared memory leak in mxnet.gluon.data.DataLoader.
   
   ## Environment info (Required)
   Python 2.7.13, Anaconda
   nvidia-docker, CUDA 8.0
   mxnet 1.0 (installed from pip)
   
   
   ## Minimum reproducible example
   ```
   import numpy as np
   import mxnet as mx
   from mxnet.gluon.data import Dataset, DataLoader


   class TestIter(Dataset):
       def __getitem__(self, idx):
           return (mx.nd.array(np.zeros(shape=(1, 3, 1024, 1024))),
                   mx.nd.array(np.zeros(shape=(1, 3, 1024, 1024))))

       def __len__(self):
           return 100


   def batch_fn(data):
       imgs = []
       heatmaps = []
       for img, heatmap in data:
           imgs.append(mx.nd.expand_dims(img, axis=0))
           heatmaps.append(mx.nd.expand_dims(heatmap, axis=0))
       data = [mx.nd.concatenate(imgs, axis=0)]
       label = [mx.nd.concatenate(heatmaps, axis=0)]
       return mx.io.DataBatch(data=data, label=label)


   if __name__ == "__main__":
       data_loader = DataLoader(TestIter(), batch_size=2, shuffle=True,
                                batchify_fn=batch_fn, num_workers=8)
       for _ in range(1000):
           for batch in data_loader:
               d = batch.data[0].asnumpy()
               l = batch.label[0].asnumpy()
               print(d.shape, l.shape)
   ```
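   For reference, here is a quick NumPy mirror of batch_fn (my paraphrase, not using MXNet) that shows what the collate step produces: each sample already carries a leading singleton axis, so the stacked batch comes out as (2, 1, 3, 1024, 1024) rather than (2, 3, 1024, 1024).

   ```python
   import numpy as np

   def batch_fn_np(data):
       # NumPy mirror of batch_fn above: add a new leading axis per sample,
       # then concatenate along it to form the batch.
       imgs = [np.expand_dims(img, axis=0) for img, _ in data]
       heatmaps = [np.expand_dims(hm, axis=0) for _, hm in data]
       return np.concatenate(imgs, axis=0), np.concatenate(heatmaps, axis=0)

   # Two samples shaped like the repro's, with a smaller spatial size for speed.
   samples = [(np.zeros((1, 3, 8, 8)), np.zeros((1, 3, 8, 8))) for _ in range(2)]
   d, l = batch_fn_np(samples)
   print(d.shape, l.shape)  # (2, 1, 3, 8, 8) (2, 1, 3, 8, 8)
   ```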
   
   ## Steps to reproduce
   
   This code dies after some time. I found that the reason is that shared-memory usage keeps increasing while it runs; if I set num_workers to 0, everything is fine. I think this is a bug.
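   To confirm the growth, one can watch how much data sits in /dev/shm while the loop runs (a hypothetical helper of my own, assuming a Linux host; the DataLoader workers hand batches back through shared memory, so leaked segments tend to show up as files there):

   ```python
   import os

   def shm_usage_mb(path="/dev/shm"):
       # Sum the sizes of all entries under `path` (the tmpfs that backs
       # POSIX shared memory on Linux) and report the total in megabytes.
       total = 0
       for name in os.listdir(path):
           try:
               total += os.path.getsize(os.path.join(path, name))
           except OSError:
               pass  # a segment may disappear between listdir() and stat()
       return total / (1024.0 * 1024.0)

   if __name__ == "__main__" and os.path.isdir("/dev/shm"):
       # Call this between epochs of the repro above; on the affected setup
       # the reported number keeps climbing until the loader dies.
       print("shared memory in use: %.1f MB" % shm_usage_mb())
   ```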
   
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services