Posted to discuss-archive@mxnet.apache.org by yamada via MXNet Forum <mx...@discoursemail.com.INVALID> on 2020/06/16 05:27:44 UTC

[MXNet Forum] How to limit GPU memory usage


Thank you for your reply.
Following your advice, I tried running with:

**MXNET_GPU_MEM_POOL_TYPE:Not set**
**MXNET_GPU_MEM_POOL_RESERVE:60**
(I confirmed with the set command of anaconda)

However, the usage rate did not drop to 40%; it stayed unchanged at 81%.
What's wrong with my setup?
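For reference, a minimal sketch of setting these variables from Python. The runtime reads them once at startup, so it is safest to set them before `import mxnet`; the values mirror the post, and everything else here is illustrative:

```python
import os

# Set before importing mxnet: the GPU memory pool reads these
# environment variables when the runtime initializes.
os.environ["MXNET_GPU_MEM_POOL_RESERVE"] = "60"
# MXNET_GPU_MEM_POOL_TYPE is deliberately left unset, as in the post.
os.environ.pop("MXNET_GPU_MEM_POOL_TYPE", None)

# import mxnet as mx  # commented out here; requires a GPU build of MXNet

print(os.environ["MXNET_GPU_MEM_POOL_RESERVE"])
```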

Attached the confirmation result of GPU usage rate
![mxnet_gpu_LI|376x500](upload://pgEq53kmQztETCvEBtZw22FyBc5.jpeg)





---
[Visit Topic](https://discuss.mxnet.io/t/how-to-limit-gpu-memory-usage/6304/3) or reply to this email to respond.

You are receiving this because you enabled mailing list mode.

To unsubscribe from these emails, [click here](https://discuss.mxnet.io/email/unsubscribe/8f602584d2d535e26038bd1aa2bcd5e7a6cfc1d7cee1bdb51350e0bab81b98c7).

[MXNet Forum] How to limit GPU memory usage

Posted by Smh via MXNet Forum <mx...@discoursemail.com.INVALID>.

You can have a look at this paper on [sublinear memory usage](https://arxiv.org/pdf/1604.06174.pdf), which covers some common solutions used by DL frameworks to lower GPU memory usage.

If you only care about `Forward Inference`, you can try reducing the batch size (at the cost of speed) or quantizing the network (int8, float16, etc.). As far as I know, the MKL-DNN backend of MXNet supports this; otherwise, TensorRT also has good support for model quantization. There are some other methods you can try as well.
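A rough back-of-the-envelope for why both knobs help: activation memory grows roughly linearly with batch size, and float16 halves the bytes per element. The numbers below are purely illustrative, not measured from any real network:

```python
def activation_bytes(batch_size, elems_per_sample, bytes_per_elem):
    """Crude activation-memory estimate: linear in batch size."""
    return batch_size * elems_per_sample * bytes_per_elem

MIB = 2 ** 20
ELEMS = 50_000_000  # illustrative: 50M activation elements per sample

print(activation_bytes(32, ELEMS, 4) // MIB)  # float32, batch 32
print(activation_bytes(8, ELEMS, 4) // MIB)   # batch 8: ~1/4 the memory
print(activation_bytes(8, ELEMS, 2) // MIB)   # float16: halved again
```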

How about the GPU memory cost of the same model using TF or PyTorch?





---
[Visit Topic](https://discuss.mxnet.io/t/how-to-limit-gpu-memory-usage/6304/6) or reply to this email to respond.


[MXNet Forum] How to limit GPU memory usage

Posted by Jason Ho via MXNet Forum <mx...@discoursemail.com.INVALID>.

@TriLoon Any other thoughts on this? It seems like you have quite a bit more experience, and we're facing a similar issue.

MXNet on InsightFace is taking up around 7 GB of GPU memory during inference. That seems large?





---
[Visit Topic](https://discuss.mxnet.io/t/how-to-limit-gpu-memory-usage/6304/5) or reply to this email to respond.


[MXNet Forum] How to limit GPU memory usage

Posted by Smh via MXNet Forum <mx...@discoursemail.com.INVALID>.

I believe the `MXNET_GPU_MEM_POOL_RESERVE` environment variable is just a hint for `MXNet` to release previously allocated but now **FREED** GPU memory, which is otherwise kept for possible reuse in the future. If your network is large (for example, if the GPU cannot run two such networks simultaneously), one way to handle this is to implement `memory sharing` between the calculations of shallow and deeper layers, or to use more aggressive `inplace` operators; either of these two methods would involve some work in the DL framework.

BTW, have you tried running the network with `TF` under the same GPU memory budget without an `OOM` error being raised? As far as I know, MXNet already has a good implementation that uses relatively little GPU memory.

Also, in my opinion `GPU Load` refers to the compute capacity (for example, the CUDA cores) used by the current application, not to 81% of memory being used; for that metric, higher means better use of the GPU.
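To make the distinction concrete, `nvidia-smi --query-gpu=utilization.gpu,memory.used,memory.total --format=csv,noheader,nounits` reports both fields. The sketch below parses a canned sample line (values chosen to match this thread) instead of calling the tool, so no GPU is assumed:

```python
import csv
import io

# Canned sample of the nvidia-smi query output; on a real machine you
# would capture this with subprocess.run([...], capture_output=True).
sample = "81, 7626, 12710"

util, mem_used, mem_total = (int(v) for v in next(csv.reader(io.StringIO(sample))))

print(f"GPU Load:    {util}% (fraction of time the GPU was busy)")
print(f"Memory Used: {mem_used} MiB of {mem_total} MiB "
      f"= {100 * mem_used / mem_total:.0f}% of VRAM")
```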





---
[Visit Topic](https://discuss.mxnet.io/t/how-to-limit-gpu-memory-usage/6304/4) or reply to this email to respond.


[MXNet Forum] How to limit GPU memory usage

Posted by yamada via MXNet Forum <mx...@discoursemail.com.INVALID>.

Thanks for pointing that out.
I've received other suggestions that we should look at "Memory Used".

I'll check it out.





---
[Visit Topic](https://discuss.mxnet.io/t/how-to-limit-gpu-memory-usage/6304/9) or reply to this email to respond.


[MXNet Forum] How to limit GPU memory usage

Posted by Neutron via MXNet Forum <mx...@discoursemail.com.INVALID>.

You should refer to `Memory Used` (how much memory is used) rather than `GPU Load` (what percentage of time the GPU is busy).
Notice that `7626 / 0.6 = 12710`, so the environment variable **MXNET_GPU_MEM_POOL_RESERVE:60** actually works.
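Spelling out that arithmetic: 7626 MiB used, divided by 0.6, gives roughly the card's total memory, i.e. used memory is about 60% of total, consistent with the variable taking effect:

```python
mem_used_mib = 7626
reserve_pct = 60

# Total implied if used memory is reserve_pct% of the card's total.
implied_total = mem_used_mib * 100 / reserve_pct
print(implied_total)  # 12710.0
```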





---
[Visit Topic](https://discuss.mxnet.io/t/how-to-limit-gpu-memory-usage/6304/8) or reply to this email to respond.
