Posted to issues@mxnet.apache.org by "Chris Olivier (JIRA)" <ji...@apache.org> on 2018/03/06 16:15:00 UTC
[jira] [Updated] (MXNET-4) Refactor Random and ParallelRandom resources to use MKL for MKL builds
[ https://issues.apache.org/jira/browse/MXNET-4?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Chris Olivier updated MXNET-4:
------------------------------
Component/s: MXNet Engine
> Refactor Random and ParallelRandom resources to use MKL for MKL builds
> ----------------------------------------------------------------------
>
> Key: MXNET-4
> URL: https://issues.apache.org/jira/browse/MXNET-4
> Project: Apache MXNet
> Issue Type: Improvement
> Components: MXNet Engine
> Reporter: Chris Olivier
> Priority: Major
> Labels: mkl, performance
>
> Refactor Random and ParallelRandom resources to use MKL for MKL builds
> This covers things such as RngUniform, etc., similar to what is already done for the dropout operator.
> The resource may need to allocate some temporary memory and generate random numbers in batches, then serve them out from that batch.
> Also, the Random classes could export a "fill buffer with random numbers" function, which seems to be a common use case and fits the MKL API more closely.
> Care must be taken regarding MKL's fixed output types for some of the API functions.
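> As a minimal sketch of the batch-and-serve scheme described above (this is an illustration, not MXNet's actual implementation): generate uniforms in large batches via a "fill buffer" primitive, serve values from the batch, and refill when exhausted. The class name and the use of std::mt19937 are assumptions; in an MKL build the fill step would instead call MKL VSL (e.g. a vslNewStream/vsRngUniform-style API), which is why the fill-buffer primitive maps onto MKL more closely than per-value generation does.
>
> ```cpp
> #include <cstddef>
> #include <random>
> #include <vector>
>
> // Hypothetical batched-RNG resource (names are illustrative, not from
> // the MXNet codebase). std::mt19937 stands in for an MKL VSL stream,
> // since the exact MKL integration is not specified in this issue.
> class BatchedUniform {
>  public:
>   explicit BatchedUniform(std::size_t batch_size, unsigned seed = 0)
>       : buf_(batch_size), pos_(batch_size), gen_(seed) {}
>
>   // "Fill buffer with randoms" primitive. In an MKL build this is where
>   // a vsRngUniform-style batch call would go; note MKL's fixed output
>   // types (float vs. double variants) would constrain the signature.
>   void FillUniform(float* out, std::size_t n) {
>     std::uniform_real_distribution<float> dist(0.f, 1.f);
>     for (std::size_t i = 0; i < n; ++i) out[i] = dist(gen_);
>   }
>
>   // Serve one value from the current batch, refilling when exhausted.
>   float Next() {
>     if (pos_ == buf_.size()) {
>       FillUniform(buf_.data(), buf_.size());
>       pos_ = 0;
>     }
>     return buf_[pos_++];
>   }
>
>  private:
>   std::vector<float> buf_;  // temporary batch storage
>   std::size_t pos_;         // next unserved index in buf_
>   std::mt19937 gen_;        // stand-in for the MKL VSL stream
>   };
> ```
>
> The key design point is that amortizing generation over a batch lets the MKL build swap in one vectorized VSL call per refill instead of one call per value.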
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)