Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2022/04/25 18:16:14 UTC

[GitHub] [tvm] AndrewZhaoLuo commented on pull request #11109: [Frontend][ONNX]fix dropout have 3 arguments

AndrewZhaoLuo commented on PR #11109:
URL: https://github.com/apache/tvm/pull/11109#issuecomment-1108888058

   Dropout at inference time should really be a no-op. Since TVM is primarily an inference framework (for now), we don't really handle dropout; somewhere along the line all dropout ops are replaced with no-ops.
   
   I recommend setting your model to inference mode when you export it; that should resolve this issue.
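   
   For example, assuming the model comes from PyTorch (the toy model, file name, and opset below are just placeholders), a minimal sketch of an inference-mode export would be:
   
       import torch
       import torch.nn as nn
       
       # Toy model with a Dropout layer, standing in for the real model.
       model = nn.Sequential(nn.Linear(16, 16), nn.Dropout(p=0.5), nn.Linear(16, 4))
       model.eval()  # inference mode: Dropout becomes a pass-through at export time
       
       dummy_input = torch.randn(1, 16)
       torch.onnx.export(model, dummy_input, "model.onnx", opset_version=13)
   
   Exporting in eval mode keeps training-mode Dropout out of the ONNX graph, so the TVM frontend has nothing problematic to import.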
   
   Dropout has always been a tricky operator, and I would not worry about it here.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@tvm.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org