Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2021/06/10 18:59:39 UTC

[GitHub] [tvm] dvhg commented on issue #8233: [Bug] PyTorch MaskRCNN GPU OOM error

dvhg commented on issue #8233:
URL: https://github.com/apache/tvm/issues/8233#issuecomment-858918248


   That's a good point; I didn't think to check memory on CPU targets. Using the llvm target, I also see memory usage increase with each inference. After about 300 inferences, the Python process consumes ~25% of my 128 GB of physical RAM. I noticed that the rate of increase seems to slow down, but it varies a lot depending on the input.
   
   I've also seen this happen with FasterRCNN.
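   For anyone wanting to quantify the growth described above, a minimal, TVM-independent sketch follows. Here `run_inference()` is a placeholder for the real module invocation (it deliberately retains memory to mimic a leak), and peak RSS is read via the standard-library `resource` module:

   ```python
   # Hedged sketch: track per-inference memory growth via peak RSS.
   # run_inference() is a stand-in for the actual TVM call; it retains
   # an allocation on each call to simulate leak-like behaviour.
   import resource
   import sys

   leaked = []  # state retained across calls, simulating a leak

   def run_inference():
       # Placeholder for the real inference call; keeps ~1 MB per call.
       leaked.append(bytearray(1024 * 1024))

   def rss_mb():
       # ru_maxrss is reported in KB on Linux, bytes on macOS.
       rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
       return rss / (1024 * 1024) if sys.platform == "darwin" else rss / 1024

   before = rss_mb()
   for _ in range(50):
       run_inference()
   after = rss_mb()
   print(f"peak RSS grew by ~{after - before:.0f} MB over 50 calls")
   ```

   Logging the delta every N calls, rather than just once, would also show whether the rate of growth really tapers off with more inferences.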


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org