Posted to discuss-archive@tvm.apache.org by wwq666 via Apache TVM Discuss <no...@discuss.tvm.ai> on 2022/03/24 09:26:54 UTC

[Apache TVM Discuss] [Questions] Multithreaded inference


Does TVM support multithreaded inference? That is, can each thread load a precompiled .so into its own module, and then run inference on its per-thread module using the usual set_input, run, and get_output pattern? Thanks!
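A sketch of the pattern being asked about, assuming TVM's Python graph-executor API (tvm.runtime.load_module and graph_executor.GraphModule); the library path "model.so" and input name "data" are hypothetical placeholders, and this is an outline of the question's per-thread setup, not a confirmed thread-safe usage:

```python
# Hedged sketch: one GraphModule per thread, each created from the
# same precompiled .so. Paths and tensor names are hypothetical.
import threading


def run_inference(lib_path, input_name, data, results, idx):
    # TVM is imported lazily here so the scaffold is importable
    # even where TVM is not installed.
    import tvm
    from tvm.contrib import graph_executor

    dev = tvm.cpu()
    lib = tvm.runtime.load_module(lib_path)  # each thread loads the .so
    mod = graph_executor.GraphModule(lib["default"](dev))
    mod.set_input(input_name, tvm.nd.array(data))
    mod.run()
    results[idx] = mod.get_output(0).numpy()


def infer_parallel(lib_path, input_name, batches):
    # Launch one thread per input batch; each thread owns its module,
    # so no executor state is shared between threads.
    results = [None] * len(batches)
    threads = [
        threading.Thread(
            target=run_inference,
            args=(lib_path, input_name, batch, results, i),
        )
        for i, batch in enumerate(batches)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

The key design choice sketched here is module-per-thread isolation: the .so itself is shared read-only, but each thread builds its own GraphModule so set_input/run/get_output never touch another thread's state.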





---
[Visit Topic](https://discuss.tvm.apache.org/t/multithreaded-inference/12400/1) to respond.
