Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/02/20 23:51:35 UTC

[GitHub] roywei opened a new issue #14216: update params failed when params and grads are empty

URL: https://github.com/apache/incubator-mxnet/issues/14216
 
 
   This PR https://github.com/apache/incubator-mxnet/pull/13346 changed the logic for updating parameters [here](https://github.com/apache/incubator-mxnet/pull/13346/files#diff-1455b9a4a68f5a9491ddf841ba3c93c6R162).
   
   However, it broke a unit test in the downstream project [Keras-MXNet](https://github.com/awslabs/keras-apache-mxnet). I was able to reproduce the failure using pure MXNet:
   
   # Steps to reproduce
   ```
   import mxnet as mx
   from mxnet.io import DataBatch
   from mxnet import nd
   data = mx.sym.Variable('data')
   out = mx.sym.Dropout(data, 0.5)
   mod = mx.mod.Module(out)
   mod.bind(data_shapes=[('data',(1,10))])
   mod.init_params()
   mod.init_optimizer()
   data_batch = DataBatch([nd.random.uniform(0,1,(10,10))])
   mod.forward(data_batch)
   mod.backward()
   mod.update()
   ```
   
   This produces the following error message:
   ```
   /Users/lawei/anaconda3/bin/python /Users/lawei/Documents/Workspace/roywei/incubator-mxnet/example/test.py
   /Users/lawei/anaconda3/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
     from ._conv import register_converters as _register_converters
   /Users/lawei/Documents/Workspace/roywei/incubator-mxnet/python/mxnet/module/base_module.py:55: UserWarning: You created Module with Module(..., label_names=['softmax_label']) but input with name 'softmax_label' is not found in symbol.list_arguments(). Did you mean one of:
   	data
     warnings.warn(msg)
   /Users/lawei/Documents/Workspace/roywei/incubator-mxnet/python/mxnet/module/base_module.py:67: UserWarning: Data provided by label_shapes don't match names specified by label_names ([] vs. ['softmax_label'])
     warnings.warn(msg)
   Traceback (most recent call last):
     File "/Users/lawei/Documents/Workspace/roywei/incubator-mxnet/example/test.py", line 13, in <module>
       mod.update()
     File "/Users/lawei/Documents/Workspace/roywei/incubator-mxnet/python/mxnet/module/module.py", line 671, in update
       param_names=self._exec_group.param_names)
     File "/Users/lawei/Documents/Workspace/roywei/incubator-mxnet/python/mxnet/model.py", line 184, in _update_params
       i, w, g = zip(*dev_updates)
   ValueError: not enough values to unpack (expected 3, got 0)
   ```
   
   # Root cause
   When you call update on an operator that does not require params/grads, it crashes because `updates` becomes a list containing an empty list (`[[]]`) [here](https://github.com/apache/incubator-mxnet/pull/13346/files#diff-1455b9a4a68f5a9491ddf841ba3c93c6R165).
   
   Originally, during a parameter update, if both params and grads were `[]` we would simply skip the update.
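   The unpack failure is easy to reproduce in isolation. When a device has no params/grads, its update list is empty, so `zip(*dev_updates)` yields nothing and unpacking it into three names fails with exactly the error in the traceback above:
   
   ```python
   # Reproduce the failing unpack in isolation: `updates` becomes a list
   # of empty per-device lists when an operator has no params/grads.
   updates = [[]]
   for dev_updates in updates:
       try:
           i, w, g = zip(*dev_updates)  # zip() over an empty list yields nothing
       except ValueError as e:
           print(e)  # not enough values to unpack (expected 3, got 0)
   ```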
   
   Although calling module update on layers without params is not a common usage in MXNet, I would like to keep the original logic, as Keras-MXNet depends on it.
   
   
   ## What have you tried to solve it?
   
   I will create a PR that checks for empty lists before unpacking.
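   
   A minimal sketch of such a check (a hypothetical helper, not the actual patch) would skip any device whose update list is empty, restoring the old "no params/grads means no-op" behaviour:
   
   ```python
   def guarded_update(updates, updater):
       # Hypothetical sketch of the proposed guard (not the actual patch):
       # skip devices with an empty update list instead of unpacking them.
       for dev_updates in updates:
           if not dev_updates:  # empty list: nothing to update on this device
               continue
           i, w, g = zip(*dev_updates)
           for idx, weight, grad in zip(i, w, g):
               # updater signature mirrors MXNet's (index, grad, weight) order
               updater(idx, grad, weight)
   ```
   
   With this guard, `guarded_update([[]], updater)` returns without touching the updater, while non-empty update lists are processed as before.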
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services