Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/07/17 12:26:03 UTC

[GitHub] [incubator-mxnet] cloudhan commented on issue #9686: [Discussion] MXNet 2.0 Roadmap (was: APIs that might be a good idea to break in 2.0)

cloudhan commented on issue #9686: [Discussion] MXNet 2.0 Roadmap (was: APIs that might be a good idea to break in 2.0)
URL: https://github.com/apache/incubator-mxnet/issues/9686#issuecomment-512232079
 
 
   > I think we should provide a user-friendly, thread-safe inference API for deployment in C++, Java, etc. We can focus on the naive engine for inference, since it is very hard to refactor the threaded engine to be thread-safe. A good, easy-to-use executor should have the following properties:
   > 
   > * One instance of the executor is enough for multi-threaded inference, meaning it can be used simultaneously from different threads.
   > * The immutable NDArrays in the executor should be shared across threads during inference to reduce the memory footprint.
   > 
   > We currently have `MXPredCreateMultiThread` in the C API, but it is buggy and we still need to create a separate executor for each thread.
   
   Sounds like refactoring the execution engine with TBB and adding some buffering mechanism?
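   
   As a rough illustration of the property described in the quote above (one executor instance shared by all threads, holding the immutable weights once), here is a minimal C++ sketch. `ThreadSafePredictor` and its `Forward` method are hypothetical placeholders, not an existing MXNet API; they only show how callers would use such an executor from multiple threads.
   
   ```c++
   #include <memory>
   #include <thread>
   #include <vector>
   
   // Hypothetical placeholder for the proposed thread-safe executor; not an
   // existing MXNet type. Immutable weights would be loaded once and shared.
   struct ThreadSafePredictor {
     // Forward() must be safe to call concurrently; each call only touches
     // per-call input/output buffers, never shared mutable state.
     std::vector<float> Forward(const std::vector<float>& input) const {
       // Stub: a real implementation would run the graph on the naive engine.
       return std::vector<float>(1000, 0.0f);
     }
   };
   
   int main() {
     // One instance serves every thread; no per-thread executor is created.
     auto predictor = std::make_shared<ThreadSafePredictor>();
   
     std::vector<std::thread> workers;
     for (int i = 0; i < 4; ++i) {
       workers.emplace_back([predictor] {
         std::vector<float> input(3 * 224 * 224, 0.5f);  // per-thread input buffer
         auto output = predictor->Forward(input);        // shared, read-only weights
         (void)output;
       });
     }
     for (auto& t : workers) t.join();
     return 0;
   }
   ```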
