Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/10/26 02:46:59 UTC

[GitHub] [incubator-mxnet] DickJC123 commented on issue #16532: fix dropout gpu seed

DickJC123 commented on issue #16532: fix dropout gpu seed
URL: https://github.com/apache/incubator-mxnet/pull/16532#issuecomment-546561285
 
 
   @roywei I've been playing with a variant of your proposed test in which I set the seed to two different values for the two models and expect the results to be different.  This fails, because the results are identical even with the differing seeds.  The two models each get their own gpu random resource, but the two are seeded by cpu random number generators that are identical.
   
   The problem here is that the cpu rngs are not responding to mx.random.seed(); instead their seed remains 0.  The reason is that cpu rngs are requested from the ResourceManager, and the ResourceManager is a thread-local variable.  The main python thread (the one performing the mx.random.seed()) only sets the **global_seed_** data member of its own ResourceManager instance, which does not affect the seeds of the cpu rngs requested from the worker thread's ResourceManager.
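   
   The failure mode above can be sketched in plain Python (hypothetical names; MXNet's actual ResourceManager lives in C++, this is only an analogy for the thread-local behavior):

```python
import threading
import random

# Sketch of a thread-local "resource manager": each thread holds its
# own seed state, defaulting to 0, just like the report describes.
_local = threading.local()

def set_seed(seed):
    # Only updates the *calling* thread's state; worker threads that
    # request an rng later still fall back to the default seed of 0.
    _local.seed = seed

def get_rng():
    # Each thread lazily creates its own rng, seeded from that
    # thread's view of the seed (default 0 if set_seed never ran here).
    if not hasattr(_local, "rng"):
        _local.rng = random.Random(getattr(_local, "seed", 0))
    return _local.rng

set_seed(1234)  # main thread: analogous to mx.random.seed(1234)

results = []
def worker():
    results.append(get_rng().random())

for _ in range(2):
    t = threading.Thread(target=worker)
    t.start()
    t.join()

# Both workers drew from rngs seeded with 0, so their outputs match
# even though the main thread set a different seed.
assert results[0] == results[1]
```

   The same effect makes the two models' gpu random resources identical: each worker-side cpu rng that seeds them starts from 0 regardless of what the main thread requested.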
