Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/03/30 04:02:28 UTC

[GitHub] QiXuanWang opened a new issue #10329: Can't print params on gpu with collect_params

URL: https://github.com/apache/incubator-mxnet/issues/10329
 
 
   Note: Providing complete information in the most concise form is the best way to get help. This issue template serves as the checklist of essential information for most technical issues and bug reports. For non-technical issues and feature requests, feel free to present the information in whatever form you believe is best.
   
   For Q & A and discussion, please start a discussion thread at https://discuss.mxnet.io 
   
   ## Description
   The process aborts with a core dump after calling print(params) when the network's parameters are initialized on a GPU context.
   
   ## Environment info (Required)
   CentOS 7.2
   MXNet 1.0
   Python 3.6.1
   Platform     : Linux-3.10.0-514.6.1.el7.x86_64-x86_64-with-centos-7.2.1511-Core
   
   Package used (Python/R/Scala/Julia):
   Python 3.6.1
   
   ## Error Message:
   
   terminate called after throwing an instance of 'dmlc::Error'
   terminate called after throwing an instance of 'dmlc::Error  what():  driver shutting down
   '
   Abort (core dumped)
   
   
   ## Minimum reproducible example
   (If you are using your own code, please provide a short script that reproduces the error. Otherwise, please provide a link to the existing example.)
   
   ## Steps to reproduce
   Code here:
   
   import mxnet as mx
   from mxnet import gluon
   
   # Build a small Gluon LSTM and initialize its parameters on the GPU.
   ctx = mx.gpu()
   net = gluon.nn.Sequential()
   with net.name_scope():
       net.add(gluon.rnn.LSTM(3, 1))
   net.collect_params().initialize(mx.init.Xavier(), ctx=ctx)
   
   # Printing the ParameterDict triggers the crash when ctx is a GPU.
   params = net.collect_params()
   print(params)
   
   
   ## What have you tried to solve it?
   1. Changing ctx to mx.cpu() avoids the crash.
   2. Tried copying the data with as_in_context (see the sketch after this list), but I need print(params) to work on GPU in order to debug another issue that reports a non-initialized parameter during training.
   3. This may be fixed in a later version, but I don't have 1.1 installed. If my usage is incorrect, a proper error message should be reported instead of a core dump.
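   
   For reference, below is a minimal sketch of the workaround attempted in item 2, assuming the goal is to inspect the parameter values rather than just the ParameterDict summary. The loop and variable names are illustrative and not part of the original report: each parameter's data is copied from GPU to CPU with as_in_context before printing.
   
   import mxnet as mx
   from mxnet import gluon
   
   # Same small network as in the reproduction above, initialized on GPU.
   ctx = mx.gpu()
   net = gluon.nn.Sequential()
   with net.name_scope():
       net.add(gluon.rnn.LSTM(3, 1))
   net.collect_params().initialize(mx.init.Xavier(), ctx=ctx)
   
   # Copy each parameter to host memory before printing, so the values
   # are read from a CPU NDArray rather than a GPU one.
   for name, param in net.collect_params().items():
       cpu_data = param.data().as_in_context(mx.cpu())  # explicit device-to-host copy
       print(name, cpu_data.asnumpy())
   
   Alternatively, initializing with ctx=mx.cpu() (item 1) sidesteps the problem entirely, at the cost of not exercising the GPU code path being debugged.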
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services