Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2017/12/01 08:37:42 UTC

[GitHub] huyangc opened a new issue #8907: What happens when optimizer.lr is changed locally without using kvstore.set_optimizer to update?
URL: https://github.com/apache/incubator-mxnet/issues/8907
 
 
   I am running on a single machine with multiple GPUs, and the kvstore is set to `device`.
   
   I manually set `optimizer.lr` without using any scheduler and without calling `kvstore.set_optimizer` to push the updated optimizer to the kvstore, like the code below.
   ```python
   import mxnet as mx

   optimizer = mx.optimizer.create('sgd',
                                   rescale_grad=1.0 / batch_size,
                                   learning_rate=0.0005,  # remember to change the learning rate when fine-tuning
                                   momentum=0.9,
                                   wd=0.0005)
   mod.init_optimizer(kvstore='device', optimizer=optimizer)
   # after some epochs, decay the learning rate locally
   optimizer.lr *= 0.1
   ```
   
   When the lr changed, the model's loss decreased, so I believe the learning-rate change took effect. However, after delving into the MXNet source, I think it shouldn't work.
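   A plausible explanation (an assumption about MXNet internals, not confirmed here): for a local or `device` kvstore, the updater created via `mx.optimizer.get_updater` keeps a reference to the very same optimizer object that user code holds, so mutating `optimizer.lr` is immediately visible to the update step; only a distributed kvstore serializes the optimizer to the server, which is why `kvstore.set_optimizer` would be needed there. The aliasing can be sketched in plain Python (no MXNet required; `SgdLike` and `get_updater` below are simplified stand-ins):
   
   ```python
   class SgdLike:
       """Minimal stand-in for an MXNet optimizer holding a mutable lr."""
       def __init__(self, lr):
           self.lr = lr
   
   def get_updater(optimizer):
       # The closure captures a reference to the SAME optimizer instance,
       # not a copy -- analogous to mx.optimizer.get_updater.
       def update(weight, grad):
           return weight - optimizer.lr * grad
       return update
   
   opt = SgdLike(lr=0.0005)
   update = get_updater(opt)
   
   opt.lr *= 0.1              # mutate locally, as in the issue
   w = update(1.0, 1.0)       # the updater sees lr = 0.00005
   print(w)                   # 1.0 - 0.00005 = 0.99995
   ```
   
   If this aliasing assumption holds, the behavior observed in the issue is expected on a single machine even without calling `kvstore.set_optimizer`.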

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services