Posted to discuss-archive@mxnet.apache.org by Ivan via MXNet Forum <mx...@discoursemail.com.INVALID> on 2020/08/26 15:19:53 UTC

[MXNet Forum] [Gluon] Reinitialize network's weights to be able to differentiate


Hi! 
I'm working on nested learning. Right now I'm stuck on the following problem: I need to reinitialize a network's weights with new values in such a way that I can still differentiate the network's outputs with respect to these new values (via *mxnet.autograd.grad*). What is a feasible way to do this under the constraints of *mxnet.autograd.record*? Recording forbids in-place operations (*dst[:] = src* or *set_data*), and using the basic *initialize* method leaves the new values "unreachable from the outputs" of the recorded graph.
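
For concreteness, here is a minimal sketch of the failing attempt; the layer, shapes, and variable names (*net*, *w_new*) are made up for illustration:

```python
from mxnet import nd, autograd
from mxnet.gluon import nn

# Toy network; the real one is bigger, but the problem is the same.
net = nn.Dense(1, in_units=3)
net.initialize()
x = nd.random.normal(shape=(4, 3))

# New weight values that I want to differentiate the outputs by.
w_new = nd.random.normal(shape=(1, 3))
w_new.attach_grad()

with autograd.record():
    # In-place write: either rejected while recording, or it silently
    # leaves w_new disconnected from the recorded graph.
    net.weight.set_data(w_new)
    y = net(x).sum()

# Fails: w_new is "unreachable from the outputs" y.
grads = autograd.grad(y, [w_new])
```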





---
[Visit Topic](https://discuss.mxnet.io/t/reinitialize-networks-weights-to-be-able-to-differentiate/6541/1) or reply to this email to respond.
