Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2020/03/11 10:19:13 UTC

[GitHub] [incubator-mxnet] aGiant opened a new issue #17814: mxnet.gluon.data.vision.transforms.Normalize(mean=0.0, std=1.0) tuple issue within hybrid_forward()

URL: https://github.com/apache/incubator-mxnet/issues/17814
 
 
   From the official MXNet docs (https://mxnet.apache.org/api/python/docs/tutorials/packages/gluon/training/normalization/index.html and https://mxnet.apache.org/api/python/docs/api/gluon/data/vision/transforms/index.html#mxnet.gluon.data.vision.transforms.Normalize), Normalize() works very well when used imperatively, but not inside hybrid_forward().
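   For reference, a minimal imperative example of the documented behaviour (illustrative; Normalize is an image transform that expects short per-channel mean/std tuples on (C, H, W) input):
    ```python
    import mxnet as mx
    from mxnet.gluon.data.vision.transforms import Normalize

    img = mx.nd.random.uniform(shape=(3, 4, 4))                  # a (C, H, W) "image"
    norm = Normalize(mean=(0.5, 0.5, 0.5), std=(0.5, 0.5, 0.5))
    out = norm(img)                                              # works on NDArrays
    ```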
   The failing code:
    ```python
    import mxnet as mx
    from mxnet import gluon
    from mxnet.gluon import nn
    from mxnet.gluon.data.vision.transforms import Normalize

    class VAE(gluon.HybridBlock):
        def __init__(self, n_hidden, n_latent, n_layers, n_output, min_vec, max_vec, act_type='relu', **kwargs):
            super(VAE, self).__init__(**kwargs)
            self.soft_zero = 1e-10
            self.normalizer = Normalize(mean=min_vec, std=max_vec)
            self.n_latent = n_latent
            # note to self: requiring batch_size in the model definition is sad,
            # not sure how to deal with this otherwise though
            self.batch_size = 0
            self.mu = None
            # self.use_aux_logits = use_aux_logits
            # self.normalizer = nn.LayerNorm()
            self.trans = nn.HybridLambda(lambda F, x: x)
            with self.name_scope():
                self.encoder = nn.HybridSequential(prefix='encoder_')
                with self.encoder.name_scope():
                    for i in range(n_layers):
                        self.encoder.add(nn.Dense(n_hidden, activation=act_type))
                    self.encoder.add(nn.Dense(n_latent*2, activation=None))

                self.decoder = nn.HybridSequential(prefix='decoder_')
                with self.decoder.name_scope():
                    for i in range(n_layers):
                        self.decoder.add(nn.Dense(n_hidden, activation=act_type))
                    self.decoder.add(nn.Dense(n_output, activation='sigmoid'))

        def forward(self, x):
            # capture the batch size imperatively before dispatching to hybrid_forward
            self.batch_size = x.shape[0]
            return gluon.HybridBlock.forward(self, x)

        def hybrid_forward(self, F, x):
            x_normalized = self.normalizer(x)  # <-- fails here; see the traceback below
            h = self.encoder(x_normalized)
            mu_lv = F.split(h, axis=1, num_outputs=2)
            mu = mu_lv[0]
            lv = mu_lv[1]
            self.mu = mu
            # this works only for nd (i.e. a non-hybridized block), since it relies on self.batch_size
            eps = F.random_normal(loc=0, scale=1, shape=(self.batch_size, self.n_latent), ctx=model_ctx)  # model_ctx: globally defined context
            z = mu + F.exp(0.5*lv)*eps
            y = self.decoder(z)
            before_sum = 1 + lv - mu*mu - F.exp(lv)
            KL = 0.5*F.nansum(before_sum, axis=1)
            first = x_normalized*F.log(y + self.soft_zero)
            second = (1 - x_normalized)*F.log(1 - y + self.soft_zero)
            total = first + second
            logloss = F.nansum(total, axis=1)
            loss = -logloss - KL
            return loss, y
    ```
   The resulting traceback:
   ```
   MXNetError                         Traceback (most recent call last)
   <ipython-input-40-cbd52c8ac2cb> in <module>
         2 net.collect_params().initialize(mx.init.Xavier(), ctx=model_ctx)
         3 #net(mx.nd.random.uniform(shape=(128,feature_n), ctx=model_ctx))
   ----> 4 print(net.summary(mx.nd.random.uniform(shape=(1,feature_n), ctx=model_ctx)))
         5 net.hybridize()
         6 trainer = gluon.Trainer(net.collect_params(), 'SGD', {'wd':0.01}) #'learning_rate': .001,
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in summary(self, *inputs)
       648         try:
       649             self.apply(_register_summary_hook)
   --> 650             self(*inputs)
       651 
       652             line_format = '{:>20}  {:>42} {:>15}'
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in __call__(self, *args)
       546             hook(self, args)
       547 
   --> 548         out = self.forward(*args)
       549 
       550         for hook in self._forward_hooks.values():
   
   <ipython-input-36-fd3336d944b4> in forward(self, x)
        45     def forward(self,x):
        46         self.batch_size = x.shape[0]
   ---> 47         return gluon.HybridBlock.forward(self, x)
        48 
        49     # https://mxnet.apache.org/api/python/docs/tutorials/extend/custom_layer.html
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in forward(self, x, *args)
       923                     params = {i: j.data(ctx) for i, j in self._reg_params.items()}
       924 
   --> 925                 return self.hybrid_forward(ndarray, x, *args, **params)
       926 
       927         assert isinstance(x, Symbol), \
   
   <ipython-input-36-fd3336d944b4> in hybrid_forward(self, F, x)
        52         x_ = x.reshape((x.shape[0], x.shape[1], 1))
        53         #x_normalized = F.broadcast_div(F.broadcast_sub(self.flatten(x), self.min_v), (F.broadcast_sub(self.max_v, self.min_v)))
   ---> 54         x_normalized = self.normalizer(x_)
        55         h = self.encoder(x_normalized)
        56         #print(h.asnumpy()[0])
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in __call__(self, *args)
       546             hook(self, args)
       547 
   --> 548         out = self.forward(*args)
       549 
       550         for hook in self._forward_hooks.values():
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in forward(self, x, *args)
       923                     params = {i: j.data(ctx) for i, j in self._reg_params.items()}
       924 
   --> 925                 return self.hybrid_forward(ndarray, x, *args, **params)
       926 
       927         assert isinstance(x, Symbol), \
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/data/vision/transforms.py in hybrid_forward(self, F, x)
       188 
       189     def hybrid_forward(self, F, x):
   --> 190         return F.image.normalize(x, self._mean, self._std)
       191 
       192 
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/ndarray/register.py in normalize(data, mean, std, out, name, **kwargs)
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/_ctypes/ndarray.py in _imperative_invoke(handle, ndargs, keys, vals, out)
        90         c_str_array(keys),
        91         c_str_array([str(s) for s in vals]),
   ---> 92         ctypes.byref(out_stypes)))
        93 
        94     if original_output is not None:
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/base.py in check_call(ret)
       251     """
       252     if ret != 0:
   --> 253         raise MXNetError(py_str(_LIB.MXGetLastError()))
       254 
       255 
   
   MXNetError: Invalid Parameter format for std expect tuple of <float> but value='(1.0, 107000000.0, 107000000.0, 74200000.0, 3893.333333, 3735.7368420000003, 4672.0, 5838440.0, 120000000.0, 120000000.0, 120000000.0, 84418013.7826341, 3735.7368420000003, 2896.0, 4113.240146, 2000000.0, 1.0, 65534.0, 156.0, 1.0, 119999998.0, 120000000.0, 120000000.0, 120000000.0, 84800000.0, inf, 4644908.0, 4644908.0, 120000000.0, 120000000.0, 120000000.0, 84602929.2769822, 24820.0, 4672.0, 2065.0, 7125.5968458437, 1.0, 120000000.0, 120000000.0, 76900000.0, 65535.0, 24820.0, 1448.0, 1.0, 3337.142857, 4414.547151258, 19488226.550680302, 1.0, 1.0, 655453030.0, 291922.0, 12870338.0, 291922.0, 219759.0, 655453030.0, 1.0, 213557.0, 138.0, 107000000.0, 120000000.0, 19530.0, 1.0, inf, 120000000.0, 1.0, 3000000.0, 120000000.0, 65535.0, 219759.0, 12900000.0)', in operator _image_normalize(name="", std="(1.0, 107000000.0, 107000000.0, 74200000.0, 3893.333333, 3735.7368420000003, 4672.0, 5838440.0, 120000000.0, 120000000.0, 120000000.0, 84418013.7826341, 3735.7368420000003, 2896.0, 4113.240146, 2000000.0, 1.0, 65534.0, 156.0, 1.0, 119999998.0, 120000000.0, 120000000.0, 120000000.0, 84800000.0, inf, 4644908.0, 4644908.0, 120000000.0, 120000000.0, 120000000.0, 84602929.2769822, 24820.0, 4672.0, 2065.0, 7125.5968458437, 1.0, 120000000.0, 120000000.0, 76900000.0, 65535.0, 24820.0, 1448.0, 1.0, 3337.142857, 4414.547151258, 19488226.550680302, 1.0, 1.0, 655453030.0, 291922.0, 12870338.0, 291922.0, 219759.0, 655453030.0, 1.0, 213557.0, 138.0, 107000000.0, 120000000.0, 19530.0, 1.0, inf, 120000000.0, 1.0, 3000000.0, 120000000.0, 65535.0, 219759.0, 12900000.0)", mean="(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1073741320.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -4.0, -4.0, -4.0, -14.0, 0.0, -2000000.0, -32212234632.0, -32212234632.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, -536870661.0, 0.0, 0.0, 0.0, 0.0, -261000000.0, 0.0, 0.0, 0.0, 0.0, -1.0, 1.0, 0.0)")
   ```
   
   The problem lies here:
   ```
   ~/anaconda3/lib/python3.7/site-packages/mxnet/_ctypes/ndarray.py in _imperative_invoke(handle, ndargs, keys, vals, out)
        90         c_str_array(keys),
        91         c_str_array([str(s) for s in vals]),
   ---> 92         ctypes.byref(out_stypes)))
        93 
        94     if original_output is not None:
   ```
   All float values (including the mean/std tuples) are converted to str at this point (note the c_str_array([str(s) for s in vals]) call), and parsing the stringified std tuple then raises the error.
   
   Issues gathered so far inside hybrid_forward() (a hybridization-safe normalization sketch follows this list):
   1. calling self.parameters_nd_array failed
   2. calling mx.gluon.Constant failed
   3. calling F.broadcast_sub(self.flatten(x), self.min_v) failed
   4. calling self.params.get('scales_max', shape=max_vec.shape, init=mx.init.Constant(max_vec), differentiable=False) failed
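
   A minimal sketch of a hybridization-safe min/max normalization layer, assuming min_vec and max_vec are NDArrays of per-feature minima and maxima as in the snippets above (an illustration, not the library's prescribed fix). It registers the vectors as constants and uses broadcast ops instead of transforms.Normalize, which targets (C, H, W) images:
    ```python
    import mxnet as mx
    from mxnet import gluon

    class MinMaxNormalizer(gluon.HybridBlock):
        """Per-feature (x - min) / (max - min), safe under hybridize()."""
        def __init__(self, min_vec, max_vec, **kwargs):
            super(MinMaxNormalizer, self).__init__(**kwargs)
            with self.name_scope():
                # get_constant registers non-trainable parameters that
                # survive hybridization
                self.min_v = self.params.get_constant('min_v', min_vec)
                self.range_v = self.params.get_constant('range_v', max_vec - min_vec)

        def hybrid_forward(self, F, x, min_v, range_v):
            # registered constants are injected by name, as Symbol or NDArray to match F
            return F.broadcast_div(F.broadcast_sub(x, min_v), range_v)
    ```
   Usage would be the same as for any block: construct, initialize() once, then hybridize().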
   
   


[GitHub] [incubator-mxnet] aGiant commented on issue #17814: mxnet.gluon.data.vision.transforms.Normalize(mean=0.0, std=1.0) tuple issue within hybrid_forward()

Posted by GitBox <gi...@apache.org>.
URL: https://github.com/apache/incubator-mxnet/issues/17814#issuecomment-597561951
 
 
   Also tried to reproduce the demo code from https://mxnet.apache.org/api/python/docs/tutorials/packages/gluon/blocks/custom_layer_beginners.html, but got an error.
   ```
   # Do some initial imports used throughout this tutorial
   from __future__ import print_function
   import mxnet as mx
   from mxnet import nd, gluon, autograd
   from mxnet.gluon.nn import Dense
   mx.random.seed(1)    
   class NormalizationHybridLayer(gluon.HybridBlock):
       def __init__(self, hidden_units, scales):
           super(NormalizationHybridLayer, self).__init__()
   
           with self.name_scope():
               self.weights = self.params.get('weights',
                                              shape=(hidden_units, 0),
                                              allow_deferred_init=True)
   
               self.scales = self.params.get('scales',
                                         shape=scales.shape,
                                          init=mx.init.Constant(scales.asnumpy()),  # fails once hybridized: the ndarray kwarg is not JSON serializable
                                         differentiable=False)
   
       def hybrid_forward(self, F, x, weights, scales):
           normalized_data = F.broadcast_div(F.broadcast_sub(x, F.min(x)), (F.broadcast_sub(F.max(x), F.min(x))))
           weighted_data = F.FullyConnected(normalized_data, weights, num_hidden=self.weights.shape[0], no_bias=True)
           scaled_data = F.broadcast_mul(scales, weighted_data)
           return scaled_data
   
   def print_params(title, net):
       """
       Helper function to print out the state of parameters of NormalizationHybridLayer
       """
       print(title)
       hybridlayer_params = {k: v for k, v in net.collect_params().items() if 'normalizationhybridlayer' in k }
   
       for key, value in hybridlayer_params.items():
           print('{} = {}\n'.format(key, value.data()))
   
   net = gluon.nn.HybridSequential()                             # Define a Neural Network as a sequence of hybrid blocks
   with net.name_scope():                                        # Used to disambiguate saving and loading net parameters
       net.add(Dense(5))                                         # Add Dense layer with 5 neurons
       net.add(NormalizationHybridLayer(hidden_units=5,
                                        scales = nd.array([2]))) # Add our custom layer
    net.add(Dense(1))                                         # Add Dense layer with 1 neuron
   
   
   net.initialize(mx.init.Xavier(magnitude=2.24))                # Initialize parameters of all layers
   net.hybridize()                                               # Create, optimize and cache computational graph
   
    inputs = nd.random_uniform(low=-10, high=10, shape=(5, 2))     # Create 5 random examples with 2 features each in range [-10, 10]
   label = nd.random_uniform(low=-1, high=1, shape=(5, 1))
   
   mse_loss = gluon.loss.L2Loss()                                # Mean squared error between output and label
   trainer = gluon.Trainer(net.collect_params(),                 # Init trainer with Stochastic Gradient Descent (sgd) optimization method and parameters for it
                           'sgd',
                           {'learning_rate': 0.1, 'momentum': 0.9 })
   
   with autograd.record():                                       # Autograd records computations done on NDArrays inside "with" block
        output = net(inputs)                                       # Run forward propagation
   
       print_params("=========== Parameters after forward pass ===========\n", net)
       loss = mse_loss(output, label)                            # Calculate MSE
   
   loss.backward()                                               # Backward computes gradients and stores them as a separate array within each NDArray in .grad field
    trainer.step(inputs.shape[0])                                  # Trainer updates parameters of every block, using the .grad field and the optimization method (sgd in this example)
                                                                  # We provide the batch size, which is used as a divisor in the cost function formula
   print_params("=========== Parameters after backward pass ===========\n", net)
   ```
   
   Error raised:
   ```
   TypeError                          Traceback (most recent call last)
   <ipython-input-45-289ed4b30586> in <module>
        29 
        30 with autograd.record():                                       # Autograd records computations done on NDArrays inside "with" block
   ---> 31     output = net(inputs)                                       # Run forward propogation
        32 
        33     print_params("=========== Parameters after forward pass ===========\n", net)
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in __call__(self, *args)
       546             hook(self, args)
       547 
   --> 548         out = self.forward(*args)
       549 
       550         for hook in self._forward_hooks.values():
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in forward(self, x, *args)
       913             with x.context as ctx:
       914                 if self._active:
   --> 915                     return self._call_cached_op(x, *args)
       916 
       917                 try:
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in _call_cached_op(self, *args)
       803     def _call_cached_op(self, *args):
       804         if self._cached_op is None:
   --> 805             self._build_cache(*args)
       806 
       807         args, fmt = _flatten(args, "input")
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in _build_cache(self, *args)
       755 
       756     def _build_cache(self, *args):
   --> 757         data, out = self._get_graph(*args)
       758         data_names = {data.name : i for i, data in enumerate(data)}
       759         params = self.collect_params()
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in _get_graph(self, *args)
       747             params = {i: j.var() for i, j in self._reg_params.items()}
       748             with self.name_scope():
   --> 749                 out = self.hybrid_forward(symbol, *grouped_inputs, **params)  # pylint: disable=no-value-for-parameter
       750             out, self._out_format = _flatten(out, "output")
       751 
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/nn/basic_layers.py in hybrid_forward(self, F, x)
       115     def hybrid_forward(self, F, x):
       116         for block in self._children.values():
   --> 117             x = block(x)
       118         return x
       119 
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in __call__(self, *args)
       546             hook(self, args)
       547 
   --> 548         out = self.forward(*args)
       549 
       550         for hook in self._forward_hooks.values():
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in forward(self, x, *args)
       928             "HybridBlock requires the first argument to forward be either " \
       929             "Symbol or NDArray, but got %s"%type(x)
   --> 930         params = {i: j.var() for i, j in self._reg_params.items()}
       931         with self.name_scope():
       932             return self.hybrid_forward(symbol, x, *args, **params)
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in <dictcomp>(.0)
       928             "HybridBlock requires the first argument to forward be either " \
       929             "Symbol or NDArray, but got %s"%type(x)
   --> 930         params = {i: j.var() for i, j in self._reg_params.items()}
       931         with self.name_scope():
       932             return self.hybrid_forward(symbol, x, *args, **params)
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/parameter.py in var(self)
       602             self._var = symbol.var(self.name, shape=self.shape, dtype=self.dtype,
       603                                    lr_mult=self.lr_mult, wd_mult=self.wd_mult,
   --> 604                                    init=self.init, stype=self._stype)
       605         return self._var
       606 
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/symbol/symbol.py in var(name, attr, shape, lr_mult, wd_mult, dtype, init, stype, **kwargs)
      2649     if init is not None:
      2650         if not isinstance(init, string_types):
   -> 2651             init = init.dumps()
      2652         attr['__init__'] = init
      2653     if stype is not None:
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/initializer.py in dumps(self)
       114         '["xavier", {"rnd_type": "uniform", "magnitude": 2.34, "factor_type": "in"}]'
       115         """
   --> 116         return json.dumps([self.__class__.__name__.lower(), self._kwargs])
       117 
       118     def __call__(self, desc, arr):
   
   ~/anaconda3/lib/python3.7/json/__init__.py in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)
       229         cls is None and indent is None and separators is None and
       230         default is None and not sort_keys and not kw):
   --> 231         return _default_encoder.encode(obj)
       232     if cls is None:
       233         cls = JSONEncoder
   
   ~/anaconda3/lib/python3.7/json/encoder.py in encode(self, o)
       197         # exceptions aren't as detailed.  The list call should be roughly
       198         # equivalent to the PySequence_Fast that ''.join() would do.
   --> 199         chunks = self.iterencode(o, _one_shot=True)
       200         if not isinstance(chunks, (list, tuple)):
       201             chunks = list(chunks)
   
   ~/anaconda3/lib/python3.7/json/encoder.py in iterencode(self, o, _one_shot)
       255                 self.key_separator, self.item_separator, self.sort_keys,
       256                 self.skipkeys, _one_shot)
   --> 257         return _iterencode(o, 0)
       258 
       259 def _make_iterencode(markers, _default, _encoder, _indent, _floatstr,
   
   ~/anaconda3/lib/python3.7/json/encoder.py in default(self, o)
       177 
       178         """
   --> 179         raise TypeError(f'Object of type {o.__class__.__name__} '
       180                         f'is not JSON serializable')
       181 
   
    TypeError: Object of type ndarray is not JSON serializable
    ```
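
   The traceback explains the failure: under hybridize(), Parameter.var() serializes the initializer via init.dumps(), i.e. json.dumps() of the initializer's kwargs, and the ndarray stored inside mx.init.Constant is not JSON serializable. A hedged sketch of the fix, anticipating zhreshold's suggestion below (get_constant stores the values directly, so no initializer kwargs need to be serialized):
    ```python
    # inside NormalizationHybridLayer.__init__, replacing the params.get(...) call
    with self.name_scope():
        self.weights = self.params.get('weights',
                                       shape=(hidden_units, 0),
                                       allow_deferred_init=True)
        self.scales = self.params.get_constant('scales', value=scales)
    ```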


[GitHub] [incubator-mxnet] aGiant commented on issue #17814: mxnet.gluon.data.vision.transforms.Normalize(mean=0.0, std=1.0) tuple issue within hybrid_forward()

Posted by GitBox <gi...@apache.org>.
URL: https://github.com/apache/incubator-mxnet/issues/17814#issuecomment-598112586
 
 
   > For the first example, I noticed that you are defining both `hybrid_forward` and `forward` at the same time which is not supposed to work in this way.
   > 
   > For the second example, you can use `get_constant` instead of `get`
   > 
   > ```python
   > # self.scales = self.params.get('scales', shape=scales.shape, init=mx.init.Constant(scales.asnumpy()),  differentiable=False)
   > 
   > self.scales = self.params.get_constant('scales', value=scales)
   > ```
   
   Also tried an nn.HybridLambda layer to simplify the normalization, but got a rather strange error:
   
   ```
   <ipython-input-21-6ebe4f1a4082> in <lambda>(F, x)
        21         # self.normalizer = nn.LayerNorm()
        22         with self.name_scope():
   ---> 23             self.normalizer = nn.HybridLambda(lambda F,x: (x-min_vec)/max_vec) #NormalizationHybridLayer(min_vec, max_vec)
        24             self.encoder = nn.HybridSequential(prefix='encoder_')
        25             with self.encoder.name_scope():
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/symbol/symbol.py in __sub__(self, other)
       142             return _internal._MinusScalar(self, scalar=other)
       143         else:
   --> 144             raise TypeError('type %s not supported' % str(type(other)))
       145 
       146     def __isub__(self, other):
   
   TypeError: type <class 'mxnet.ndarray.ndarray.NDArray'> not supported
   ```
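
   This error is expected once the block is hybridized: x is then a Symbol, while the min_vec and max_vec captured by the lambda closure are still NDArrays, and Symbol arithmetic only accepts scalars or other Symbols. A minimal illustration of the same failure:
    ```python
    import mxnet as mx

    sym = mx.sym.var('x')
    vec = mx.nd.array([1.0, 2.0])
    sym - vec  # TypeError: type <class 'mxnet.ndarray.ndarray.NDArray'> not supported
    ```
   Per-feature vectors therefore have to enter the graph as registered constants (get_constant), as in the sketch earlier in the thread, rather than as closed-over NDArrays.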


[GitHub] [incubator-mxnet] aGiant commented on issue #17814: mxnet.gluon.data.vision.transforms.Normalize(mean=0.0, std=1.0) tuple issue within hybrid_forward()

Posted by GitBox <gi...@apache.org>.
URL: https://github.com/apache/incubator-mxnet/issues/17814#issuecomment-598071179
 
 
   > For the first example, I noticed that you are defining both `hybrid_forward` and `forward` at the same time which is not supposed to work in this way.
   > 
   > For the second example, you can use `get_constant` instead of `get`
   > 
   > ```python
   > # self.scales = self.params.get('scales', shape=scales.shape, init=mx.init.Constant(scales.asnumpy()),  differentiable=False)
   > 
   > self.scales = self.params.get_constant('scales', value=scales)
   > ```
   
   The first example's forward() worked very well for capturing x.shape; that was the only way I found to get the input shape.
   
   After applying self.params.get_constant("scales", value=scales), I got a new error:
   
   ```
   ---------------------------------------------------------------------------
   RuntimeError                              Traceback (most recent call last)
   <ipython-input-20-a5e4b7303bc4> in <module>
        19         data = batch.data[0].as_in_context(model_ctx)
        20         with autograd.record():
   ---> 21             loss, y = net(data)
        22             #loss, y, kl, ll, f, s, t, v = net(data)
        23 
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in __call__(self, *args)
       691             hook(self, args)
       692 
   --> 693         out = self.forward(*args)
       694 
       695         for hook in self._forward_hooks.values():
   
   <ipython-input-17-f9e660a6116f> in forward(self, x)
        35     def forward(self,x):
        36         self.batch_size = x.shape[0]
   ---> 37         return gluon.HybridBlock.forward(self, x)
        38 
        39     # https://mxnet.apache.org/api/python/docs/tutorials/extend/custom_layer.html
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in forward(self, x, *args)
      1156                     params = {k: v.data(ctx) for k, v in self._reg_params.items()}
      1157 
   -> 1158                 return self.hybrid_forward(ndarray, x, *args, **params)
      1159 
      1160         params = {i: j.var() for i, j in self._reg_params.items()}
   
   <ipython-input-17-f9e660a6116f> in hybrid_forward(self, F, x)
        42         #x_ = x.reshape((x.shape[0], x.shape[1], 1))
        43         #x_normalized = F.broadcast_div(F.broadcast_sub(self.flatten(x), self.min_v), (F.broadcast_sub(self.max_v, self.min_v)))
   ---> 44         x_normalized = self.normalizer(x)
        45         h = self.encoder(x_normalized)
        46         #print(h.asnumpy()[0])
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in __call__(self, *args)
       691             hook(self, args)
       692 
   --> 693         out = self.forward(*args)
       694 
       695         for hook in self._forward_hooks.values():
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in forward(self, x, *args)
      1149             with ctx:
      1150                 try:
   -> 1151                     params = {k: v.data(ctx) for k, v in self._reg_params.items()}
      1152                 except DeferredInitializationError:
      1153                     self._deferred_infer_shape(x, *args)
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/block.py in <dictcomp>(.0)
      1149             with ctx:
      1150                 try:
   -> 1151                     params = {k: v.data(ctx) for k, v in self._reg_params.items()}
      1152                 except DeferredInitializationError:
      1153                     self._deferred_infer_shape(x, *args)
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/parameter.py in data(self, ctx)
       563                                "because its storage type is %s. Please use row_sparse_data() " \
       564                                "instead." % (self.name, str(ctx), self._stype))
   --> 565         return self._check_and_get(self._data, ctx)
       566 
       567     def list_data(self):
   
   ~/anaconda3/lib/python3.7/site-packages/mxnet/gluon/parameter.py in _check_and_get(self, arr_list, ctx)
       240             "with Block.collect_params() instead of Block.params " \
       241             "because the later does not include Parameters of " \
   --> 242             "nested child Blocks"%(self.name))
       243 
       244     def _get_row_sparse(self, arr_list, ctx, row_id):
   
   RuntimeError: Parameter 'vae5_normalizationhybridlayer0_bias' has not been initialized. Note that you should initialize parameters and create Trainer with Block.collect_params() instead of Block.params because the later does not include Parameters of nested child Blocks
   ```
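
   The RuntimeError means some parameters of the nested custom layer were never initialized on the target context. A hedged sketch of the usual remedy, assuming net and model_ctx as in the snippets above: (re)initialize via collect_params() so that nested child blocks are included, and force re-initialization for parameters added after an earlier initialize() call.
    ```python
    # collect_params() walks nested child blocks, unlike Block.params;
    # force_reinit also covers parameters created after a previous initialize()
    net.collect_params().initialize(mx.init.Xavier(), ctx=model_ctx, force_reinit=True)
    ```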


[GitHub] [incubator-mxnet] zhreshold commented on issue #17814: mxnet.gluon.data.vision.transforms.Normalize(mean=0.0, std=1.0) tuple issue within hybrid_forward()

Posted by GitBox <gi...@apache.org>.
URL: https://github.com/apache/incubator-mxnet/issues/17814#issuecomment-597936127
 
 
   For the first example, I noticed that you are defining both `hybrid_forward` and `forward` at the same time, which is not supposed to work this way.
   
   For the second example, you can use `get_constant` instead of `get`
   ```python
   # self.scales = self.params.get('scales', shape=scales.shape, init=mx.init.Constant(scales.asnumpy()),  differentiable=False)
   
    self.scales = self.params.get_constant('scales', value=scales)  # get_constant takes a name plus the value
   ```
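
   For completeness, a sketch of the full corrected layer (illustrative; the registered constant is injected into hybrid_forward by name, just like the trainable weights):
    ```python
    import mxnet as mx
    from mxnet import gluon

    class NormalizationHybridLayer(gluon.HybridBlock):
        def __init__(self, hidden_units, scales):
            super(NormalizationHybridLayer, self).__init__()
            with self.name_scope():
                self.weights = self.params.get('weights', shape=(hidden_units, 0),
                                               allow_deferred_init=True)
                self.scales = self.params.get_constant('scales', value=scales)

        def hybrid_forward(self, F, x, weights, scales):
            # registered parameters arrive as NDArray or Symbol, matching F
            normalized = F.broadcast_div(F.broadcast_sub(x, F.min(x)),
                                         F.broadcast_sub(F.max(x), F.min(x)))
            weighted = F.FullyConnected(normalized, weights,
                                        num_hidden=self.weights.shape[0], no_bias=True)
            return F.broadcast_mul(scales, weighted)
    ```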
