Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/10/11 04:08:24 UTC

[GitHub] trtcrd opened a new issue #6783: mx.set.seed() does not work using GPU

URL: https://github.com/apache/incubator-mxnet/issues/6783
 
 
   Hello everyone, 
   
   I am trying to use the mxnet R package (version 10.1) for a regression task.
   mxnet was compiled on Ubuntu 16.04 following the official tutorial, runs on an NVIDIA Titan Xp GPU, and is accessed through RStudio Server.
   
   For a reason I cannot figure out, the mx.set.seed() function has no effect when training on the GPU, while it works as expected on the CPU.
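
   For context, a frequent source of GPU-only run-to-run variation is that parallel reductions may accumulate terms in a different order on every launch, and floating-point addition is not associative. This is only an assumption about the cause here, not a confirmed diagnosis; a minimal Python sketch of the underlying effect:

   ```python
   # Floating-point addition is not associative:
   # the grouping of the same three terms changes the result.
   left_to_right = (0.1 + 0.2) + 0.3
   right_to_left = 0.1 + (0.2 + 0.3)
   print(left_to_right == right_to_left)  # False
   ```

   If GPU kernels sum gradients in a nondeterministic order, two runs can diverge even when every random draw is identically seeded.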
   
   Here is code that reproduces the issue on this machine, with (very) quick and (very) dirty data generation:
   
   
   ```
   # Data
   nObs <- 500
   
   # train data
   train.x <- runif(nObs)
   for (i in 2:500) train.x <- cbind(train.x, runif(nObs))
   train.y <- runif(nObs)
   
   #test data
   test.x <- runif(nObs)
   for (i in 2:500) test.x <- cbind(test.x, runif(nObs))
   
   #prediction on gpu
   pred_gpu <- c()
   
   for (i in 1:3)
   {
     mx.set.seed(1234)
     mod <- mx.mlp(as.matrix(train.x), train.y,
                   device = mx.gpu(),
                   verbose = TRUE,
                   dropout = 0.1,
                   momentum = 0.01,
                   array.layout = "rowmajor",
                   learning.rate = 0.01,
                   hidden_node = 100,
                   out_node = 1,
                   num.round = 100,
                   activation = "sigmoid",
                   out_activation = "rmse",
                   eval.metric = mx.metric.rmse)
     pred <- predict(mod, as.matrix(test.x), array.layout="rowmajor")
     pred_gpu <- rbind(pred_gpu, pred)
   }
   
   
   #prediction on cpu
   pred_cpu <- c()
   
   for (i in 1:3)
   {
     mx.set.seed(1234)
     mod <- mx.mlp(as.matrix(train.x), train.y,
                   device = mx.cpu(),
                   verbose = TRUE,
                   dropout = 0.1,
                   momentum = 0.01,
                   array.layout = "rowmajor",
                   learning.rate = 0.01,
                   hidden_node = 100,
                   out_node = 1,
                   num.round = 100,
                   activation = "sigmoid",
                   out_activation = "rmse",
                   eval.metric = mx.metric.rmse)
     pred <- predict(mod, as.matrix(test.x), array.layout="rowmajor")
     pred_cpu <- rbind(pred_cpu, pred)
   }
   
    print(pred_gpu[,1])
    #> 0.5065710 0.5087389 0.5078545
    print(pred_cpu[,1])
    #> 0.5073739 0.5073739 0.5073739
   
   ```
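
   A narrower control experiment would be to seed and then sample random numbers directly on each device, skipping training entirely; with mxnet that would presumably mean mx.set.seed() followed by mx.runif() on mx.gpu() and mx.cpu(). The pattern is sketched below with NumPy only because it is library-agnostic (none of these names come from mxnet):

   ```python
   import numpy as np

   def seeded_draws(seed, n=5):
       # Re-seed a fresh generator each call, mirroring the
       # mx.set.seed()-before-every-run pattern in the R code above.
       rng = np.random.default_rng(seed)
       return rng.random(n)

   first = seeded_draws(1234)
   second = seeded_draws(1234)
   print(np.array_equal(first, second))  # True: same seed, same draws
   ```

   If the analogous mxnet draws differ across GPU runs with the same seed, the seeding itself is broken; if they match, the nondeterminism comes from the training path (e.g. dropout or gradient accumulation) rather than the RNG.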
   
   Is there something that I am doing wrong? 
   Thanks a lot in advance!
   
   ## Environment info
   Operating System: ubuntu 16.04
   
   Compiler: gcc
   
   Package used (Python/R/Scala/Julia): R
   
   MXNet version: 10.1
   
   Or if installed from source:
   
   MXNet commit hash (git rev-parse HEAD):
   
   R sessionInfo():
   
   R version 3.4.0 (2017-04-21)
   Platform: x86_64-pc-linux-gnu (64-bit)
   Running under: Ubuntu 16.04.2 LTS
   
   Matrix products: default
   BLAS: /usr/lib/openblas-base/libblas.so.3
   LAPACK: /usr/lib/libopenblasp-r0.2.18.so
   
   locale:
   [1] LC_CTYPE=en_US.UTF-8 LC_NUMERIC=C LC_TIME=en_US.UTF-8 LC_COLLATE=en_US.UTF-8
   [5] LC_MONETARY=en_US.UTF-8 LC_MESSAGES=en_US.UTF-8 LC_PAPER=en_US.UTF-8 LC_NAME=C
   [9] LC_ADDRESS=C LC_TELEPHONE=C LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services