Posted to discuss-archive@mxnet.apache.org by ali via MXNet Forum <mx...@discoursemail.com.INVALID> on 2020/12/16 18:11:52 UTC

[MXNet Forum] Out of Memory when using gluoncv.data.COCODetection


I followed the example to load the COCO datasets as follows:

    from gluoncv import data, utils
    # load the COCO 2017 detection splits (annotations are parsed up front)
    train_dataset = data.COCODetection(splits=['instances_train2017'])
    val_dataset = data.COCODetection(splits=['instances_val2017'])

The train_dataset contains about 117266 images and labels.
Now I am simply trying to loop through the dataset:

    # index each sample; COCODetection loads the image from disk on access
    for i in range(len(train_dataset)):
        train_image, train_label = train_dataset[i]

This loop eats up all 15 GB of my RAM before I am even 60% of the way through the data.

How can I make sure that my memory is used properly, or how can I set a batch size, etc.?
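
For what it's worth, here is the batched-loading pattern I was considering, adapted from the GluonCV detection tutorials. I have not confirmed that it fixes the memory growth, and the image size, batch size, and worker count below are placeholder values I have not tuned:

    from mxnet.gluon.data import DataLoader
    from gluoncv.data.batchify import Tuple, Stack, Pad
    from gluoncv.data.transforms.presets.ssd import SSDDefaultTrainTransform

    # resize/augment each sample to a fixed-size tensor; 512x512 is a placeholder
    transform = SSDDefaultTrainTransform(512, 512)

    # images stack into a single array; labels vary in length per image, so pad them
    batchify_fn = Tuple(Stack(), Pad(pad_val=-1))

    train_loader = DataLoader(
        train_dataset.transform(transform),
        batch_size=8,            # placeholder batch size
        shuffle=True,
        batchify_fn=batchify_fn,
        last_batch='rollover',
        num_workers=2,           # placeholder worker count
    )

    # each batch is materialized on demand, so only one batch needs to fit in RAM
    for batch_images, batch_labels in train_loader:
        pass  # training step would go here

Is this the right approach, or is there something else holding references to the loaded images?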

---
[Visit Topic](https://discuss.mxnet.apache.org/t/out-of-memory-when-using-gluoncv-data-cocodetection/6778/1)