Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2020/01/11 12:42:49 UTC

[GitHub] [incubator-mxnet] dithyrambe opened a new issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

dithyrambe opened a new issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275
 
 
   ## Description
   `CosineEmbeddingLoss` seems to have an issue in its symbol API implementation. The loss works fine until I hybridize my model. It looks like some reshaping issue.
   
   Below is the stacktrace and the snippet to reproduce.
   
   ```
   Traceback (most recent call last):
     File "tmp.tmp", line 9, in <module>
       loss(x, x, labels)
     File "/opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/gluon/block.py", line 548, in __call__
       out = self.forward(*args)
     File "/opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/gluon/block.py", line 915, in forward
       return self._call_cached_op(x, *args)
     File "/opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/gluon/block.py", line 821, in _call_cached_op
       out = self._cached_op(*cargs)
     File "/opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/_ctypes/ndarray.py", line 150, in __call__
       ctypes.byref(out_stypes)))
     File "/opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/base.py", line 253, in check_call
       raise MXNetError(py_str(_LIB.MXGetLastError()))
   mxnet.base.MXNetError: Error in operator cosineembeddingloss0__div0: [12:59:44] /home/travis/build/dmlc/mxnet-distro/mxnet-build/3rdparty/mshadow/../../src/operator/tensor/../elemwise_op_common.h:135: Check failed: assign(&dattr, vec.at(i)): Incompatible attr in node cosineembeddingloss0__div0 at 1-th input: expected [2], got [1,2]
   Stack trace:
     [bt] (0) /opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x2795cb) [0x7fbbb2fc35cb]
     [bt] (1) /opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x3e8e63) [0x7fbbb3132e63]
     [bt] (2) /opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x3e9755) [0x7fbbb3133755]
     [bt] (3) /opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x9068f1) [0x7fbbb36508f1]
     [bt] (4) /opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x23c8262) [0x7fbbb5112262]
     [bt] (5) /opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x23cab2a) [0x7fbbb5114b2a]
     [bt] (6) /opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x23e6b0f) [0x7fbbb5130b0f]
     [bt] (7) /opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/libmxnet.so(mxnet::CachedOp::CheckDynamicShapeExists(mxnet::Context const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, bool)+0x3e3) [0x7fbbb5137ad3]
     [bt] (8) /opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet/libmxnet.so(mxnet::CachedOp::Forward(std::shared_ptr<mxnet::CachedOp> const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&)+0xa5a) [0x7fbbb513e74a]
   ```
   
   ## To Reproduce
   ```python
   import mxnet as mx
   
   x = mx.random.randn(2, 2)
   labels = mx.nd.ones(2)
   loss = mx.gluon.loss.CosineEmbeddingLoss()
   
   print(loss(x, x, labels))  # works fine
   loss.hybridize()
   print(loss(x, x, labels))  # shape issue
   
   ```
   
   ## Attempt to solve
   I managed to get around it by explicitly calling `F.reshape` instead of `<tensor>.reshape` in `CosineEmbeddingLoss._cosine_similarity`:
   ```python
   import mxnet as mx
   from mxnet import ndarray
   from mxnet.gluon.loss import CosineEmbeddingLoss
   
   class FixedCosineEmbeddingLoss(CosineEmbeddingLoss):
       def _cosine_similarity(self, F, x, y, axis=-1):
           # Calculates the cosine similarity between 2 vectors
           x_norm = F.reshape(F.norm(x, axis=axis), shape=(-1, 1))
           y_norm = F.reshape(F.norm(y, axis=axis), shape=(-1, 1))
           x_dot_y = F.reshape(F.sum(x * y, axis=axis), shape=(-1, 1))
           if F is ndarray:
               eps_arr = F.array([1e-12])
           else:
               eps_arr = F.full((1, 1), 1e-12)
           return (x_dot_y / F.broadcast_maximum(x_norm * y_norm, eps_arr))
   
   
   x = mx.random.randn(2, 2)
   labels = mx.nd.ones(2)
   loss = FixedCosineEmbeddingLoss()
   
   print(loss(x, x, labels))
   loss.hybridize()
   print(loss(x, x, labels))  # works fine now
   ```
   
   ## Environment
   ```
   curl --retry 10 -s https://raw.githubusercontent.com/dmlc/gluon-nlp/master/tools/diagnose.py | python
   
   ----------Python Info----------
   Version      : 3.6.9
   Compiler     : GCC 7.3.0
   Build        : ('default', 'Jul 30 2019 19:07:31')
   Arch         : ('64bit', '')
   ------------Pip Info-----------
   Version      : 19.3.1
   Directory    : /opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/pip
   ----------MXNet Info-----------
   Version      : 1.5.1
   Directory    : /opt/miniconda3/envs/deep-search/lib/python3.6/site-packages/mxnet
   Num GPUs     : 0
   Commit Hash   : c9818480680f84daa6e281a974ab263691302ba8
   ----------System Info----------
   Platform     : Linux-5.0.0-37-generic-x86_64-with-debian-buster-sid
   system       : Linux
   node         : lbctp
   release      : 5.0.0-37-generic
   version      : #40~18.04.1-Ubuntu SMP Thu Nov 14 12:06:39 UTC 2019
   ----------Hardware Info----------
   machine      : x86_64
   processor    : x86_64
   Architecture:          x86_64
   CPU op-mode(s):        32-bit, 64-bit
   Byte Order:            Little Endian
   CPU(s):                8
   On-line CPU(s) list:   0-7
   Thread(s) per core:    2
   Core(s) per socket:    4
   Socket(s):             1
   NUMA node(s):          1
   Vendor ID:             GenuineIntel
   CPU family:            6
   Model:                 142
   Model name:            Intel(R) Core(TM) i7-8550U CPU @ 1.80GHz
   Stepping:              10
   CPU MHz:               2599.998
   CPU max MHz:           4000.0000
   CPU min MHz:           400.0000
   BogoMIPS:              3984.00
   Virtualization:        VT-x
   L1d cache:             32K
   L1i cache:             32K
   L2 cache:              256K
   L3 cache:              8192K
   NUMA node0 CPU(s):     0-7
   Flags:                 fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf tsc_known_freq pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm 3dnowprefetch cpuid_fault epb invpcid_single pti ssbd ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid ept_ad fsgsbase tsc_adjust bmi1 avx2 smep bmi2 erms invpcid mpx rdseed adx smap clflushopt intel_pt xsaveopt xsavec xgetbv1 xsaves dtherm ida arat pln pts hwp hwp_notify hwp_act_window hwp_epp md_clear flush_l1d
   ----------Network Test----------
   Setting timeout: 10
   Timing for MXNet: https://github.com/apache/incubator-mxnet, DNS: 0.0096 sec, LOAD: 0.6905 sec.
   Timing for GluonNLP GitHub: https://github.com/dmlc/gluon-nlp, DNS: 0.0007 sec, LOAD: 0.6103 sec.
   Timing for GluonNLP: http://gluon-nlp.mxnet.io, DNS: 0.0799 sec, LOAD: 0.5383 sec.
   Timing for D2L: http://d2l.ai, DNS: 0.0188 sec, LOAD: 0.2801 sec.
   Timing for D2L (zh-cn): http://zh.d2l.ai, DNS: 0.0181 sec, LOAD: 0.3634 sec.
   Timing for FashionMNIST: https://repo.mxnet.io/gluon/dataset/fashion-mnist/train-labels-idx1-ubyte.gz, DNS: 0.0261 sec, LOAD: 0.7186 sec.
   Timing for PYPI: https://pypi.python.org/pypi/pip, DNS: 0.0098 sec, LOAD: 0.3788 sec.
   Timing for Conda: https://repo.continuum.io/pkgs/free/, DNS: 0.0140 sec, LOAD: 0.0724 sec.
   ```
   
   Thanks for having a look. Let me know if I should make a PR with the tiny modification.
   Regards,
   
   Dithyrambe

[GitHub] [incubator-mxnet] lvpasquier commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
lvpasquier commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-573896943
 
 
   Hello, thanks for having a look.
   
   Actually, the shape issue seems to occur even before that, when `F.norm` is called.
   
   ```python
   import mxnet as mx
   from mxnet.gluon import HybridBlock
   
   
   class Foo(HybridBlock):
       def hybrid_forward(self, F, x, axis=-1):
           return F.norm(x, axis=axis).reshape(-1, 1)
   
   
   foo = Foo()
   x = mx.random.randn(2, 2)
   
   print(foo(x).shape)  # (2, 1)
   foo.hybridize()
   print(foo(x).shape)  # (2,)
   ```

[GitHub] [incubator-mxnet] lvpasquier edited a comment on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
lvpasquier edited a comment on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-573896943
 
 
   Hello, thanks for having a look.
   
   Actually, the shape issue seems to occur right after `F.norm` is called and `.reshape` is used, as you can see below.
   
   ```python
   import mxnet as mx
   from mxnet.gluon import HybridBlock
   
   
   class Foo(HybridBlock):
       def hybrid_forward(self, F, x, axis=-1):
           return F.norm(x, axis=axis).reshape(-1, 1)
   
   
   foo = Foo()
   x = mx.random.randn(2, 2)
   
   print(foo(x).shape)  # (2, 1)
   foo.hybridize()
   print(foo(x).shape)  # (2,)
   ```
   
   Why would `.reshape` and `F.reshape` have different behaviour?
   
   Regards

[GitHub] [incubator-mxnet] dithyrambe closed issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
dithyrambe closed issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275
 
 
   

[GitHub] [incubator-mxnet] lvpasquier removed a comment on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
lvpasquier removed a comment on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-573896943
 
 
   Hello, thanks for having a look.
   
   Actually, the shape issue seems to occur right after `F.norm` is called and `.reshape` is used, as you can see below.
   
   ```python
   import mxnet as mx
   from mxnet.gluon import HybridBlock
   
   
   class Foo(HybridBlock):
       def hybrid_forward(self, F, x, axis=-1):
           return F.norm(x, axis=axis).reshape(-1, 1)
   
   
   foo = Foo()
   x = mx.random.randn(2, 2)
   
   print(foo(x).shape)  # (2, 1)
   foo.hybridize()
   print(foo(x).shape)  # (2,)
   ```
   
   Why would `.reshape` and `F.reshape` have different behaviour?
   
   Regards

[GitHub] [incubator-mxnet] leezu commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
leezu commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-574306610
 
 
   If you think it's required to change lines 925, 926, and 927 above to use the tuple `(-1, 1)`, please create a PR. Is that what you mean?
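   
   For illustration, a minimal sketch of what that change could amount to, reusing the structure of the workaround posted above (the class name `TupleReshapeCosineEmbeddingLoss` is hypothetical and this is not a committed fix): passing the shape as a tuple gives the NDArray and Symbol fluent `.reshape` calls the same meaning.
   
   ```python
   from mxnet import ndarray
   from mxnet.gluon.loss import CosineEmbeddingLoss
   
   
   class TupleReshapeCosineEmbeddingLoss(CosineEmbeddingLoss):
       def _cosine_similarity(self, F, x, y, axis=-1):
           # Same logic as the workaround above, but keeping the fluent .reshape
           # calls and passing the target shape as a tuple.
           x_norm = F.norm(x, axis=axis).reshape((-1, 1))
           y_norm = F.norm(y, axis=axis).reshape((-1, 1))
           x_dot_y = F.sum(x * y, axis=axis).reshape((-1, 1))
           if F is ndarray:
               eps_arr = F.array([1e-12])
           else:
               eps_arr = F.full((1, 1), 1e-12)
           return x_dot_y / F.broadcast_maximum(x_norm * y_norm, eps_arr)
   ```
   
   Either this tuple form or the explicit `F.reshape` from the workaround above should behave the same after `hybridize()`.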

[GitHub] [incubator-mxnet] dithyrambe commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
dithyrambe commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-573907459
 
 
   Oh I just noticed something there:
   
   `.reshape(-1, 1)` is not the same as `.reshape((-1, 1))` with the symbolic API.
   
   Should this be fixed? Or is it the desired behaviour?
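   
   For reference, a minimal check of the two call styles on a raw symbol (a sketch assuming MXNet 1.5.x; the expected shapes follow the hybridized behaviour shown earlier in this thread):
   
   ```python
   import mxnet as mx
   
   # Compare the inferred output shape of the two reshape call styles on a Symbol.
   data = mx.sym.Variable('data')
   
   tuple_form = data.reshape((-1, 1))
   positional_form = data.reshape(-1, 1)
   
   # infer_shape returns (arg_shapes, out_shapes, aux_shapes)
   _, out_tuple, _ = tuple_form.infer_shape(data=(2,))
   _, out_positional, _ = positional_form.infer_shape(data=(2,))
   
   print(out_tuple)       # [(2, 1)]
   print(out_positional)  # [(2,)] -- the positional ints are not treated as one shape tuple
   ```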
   

[GitHub] [incubator-mxnet] lvpasquier edited a comment on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
lvpasquier edited a comment on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-573896943
 
 
   Hello, thanks for having a look.
   
   Actually, the shape issue seems to occur right after `F.norm` is called and `.reshape` is used, as you can see below.
   
   ```python
   import mxnet as mx
   from mxnet.gluon import HybridBlock
   
   
   class Foo(HybridBlock):
       def hybrid_forward(self, F, x, axis=-1):
           return F.norm(x, axis=axis).reshape(-1, 1)
   
   
   foo = Foo()
   x = mx.random.randn(2, 2)
   
   print(foo(x).shape)  # (2, 1)
   foo.hybridize()
   print(foo(x).shape)  # (2,)
   ```
   
   Regards

[GitHub] [incubator-mxnet] dithyrambe commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
dithyrambe commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-574267259
 
 
   Nice,
   
   I guess the issue is solved then.
   What about the fix?
   
   Regards

[GitHub] [incubator-mxnet] dithyrambe commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
dithyrambe commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-574348641
 
 
   All right, 
   
   Thanks for your help.

[GitHub] [incubator-mxnet] dithyrambe removed a comment on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
dithyrambe removed a comment on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-573907459
 
 
   Oh I just noticed something there:
   
   `.reshape(-1, 1)` is not the same as `.reshape((-1, 1))` with the symbolic API.
   
   Should this be fixed? Or is it the desired behaviour?
   

[GitHub] [incubator-mxnet] dithyrambe commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
dithyrambe commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-573903431
 
 
   Hello, thanks for having a look.
   
   Actually, the shape issue seems to occur right after `F.norm` is called and `.reshape` is used, as you can see below.
   
   ```python
   import mxnet as mx
   from mxnet.gluon import HybridBlock
   
   
   class Foo(HybridBlock):
       def hybrid_forward(self, F, x, axis=-1):
           return F.norm(x, axis=axis).reshape(-1, 1)
   
   
   foo = Foo()
   x = mx.random.randn(2, 2)
   
   print(foo(x).shape)  # (2, 1)
   foo.hybridize()
   print(foo(x).shape)  # (2,)
   ```
   
   Why would `.reshape` and `F.reshape` have different behaviour?
   
   Regards

[GitHub] [incubator-mxnet] leezu commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
leezu commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-574226156
 
 
   There are many subtle inconsistencies between the symbolic and ndarray APIs. Those will be fixed in MXNet 2 by migrating to a numpy-compatible interface.

[GitHub] [incubator-mxnet] dithyrambe edited a comment on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
dithyrambe edited a comment on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-573903431
 
 
   Hello, thanks for having a look.
   
   Actually, the shape issue seems to occur right after `F.norm` is called and `.reshape` is used, as you can see below.
   
   ```python
   import mxnet as mx
   from mxnet.gluon import HybridBlock
   
   
   class Foo(HybridBlock):
       def hybrid_forward(self, F, x, axis=-1):
           return F.norm(x, axis=axis).reshape(-1, 1)
   
   
   foo = Foo()
   x = mx.random.randn(2, 2)
   
   print(foo(x).shape)  # (2, 1)
   foo.hybridize()
   print(foo(x).shape)  # (2,)
   ```
   
   Why would `.reshape` and `F.reshape` have different behaviour with the symbolic API?
   
   Regards

[GitHub] [incubator-mxnet] leezu commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
leezu commented on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-573624439
 
 
   Would it be required to use `broadcast_div` instead of `/` in the last line below:
   
   https://github.com/apache/incubator-mxnet/blob/985a4ca62766ddca22b995a663d258003602f6af/python/mxnet/gluon/loss.py#L923-L932
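   
   For context, a small sketch of the difference in question (assuming MXNet 1.5.x): the failing node in the stack trace is a `__div0` elementwise division, and with symbols `/` maps to elementwise division, which requires matching shapes, whereas `broadcast_div` broadcasts them.
   
   ```python
   import mxnet as mx
   
   a = mx.sym.Variable('a')
   b = mx.sym.Variable('b')
   
   elemwise = a / b                       # elementwise division: shapes must match
   broadcast = mx.sym.broadcast_div(a, b)  # broadcasts mismatched dimensions
   
   print(broadcast.infer_shape(a=(2, 1), b=(1, 2))[1])  # [(2, 2)]
   # elemwise.infer_shape(a=(2, 1), b=(1, 2)) would fail with an
   # "Incompatible attr" error like the one reported at the top of this thread.
   ```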

[GitHub] [incubator-mxnet] dithyrambe edited a comment on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)

Posted by GitBox <gi...@apache.org>.
dithyrambe edited a comment on issue #17275: CosineEmbeddingLoss won't work after hybridize (ie. with symbol API)
URL: https://github.com/apache/incubator-mxnet/issues/17275#issuecomment-573903431
 
 
   Hello, thanks for having a look.
   
   Actually, I just realized that the shape issue comes from the fact that, in the symbolic API,
   `.reshape(-1, 1)` is not the same as `.reshape((-1, 1))`.
   
   Obviously, the simplest fix for `CosineEmbeddingLoss` would be to explicitly call `F.reshape`, or to call `.reshape((-1, 1))`.
   
   Though, shouldn't the `ndarray` and `symbolic` APIs have the same behaviour regarding `.reshape`?
   Should this be fixed? Or is it the desired behaviour?
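   
   For comparison, a quick imperative-side check (a sketch assuming MXNet 1.5.x): with `ndarray`, both call styles give the same shape, which is why the problem only shows up after `hybridize()`.
   
   ```python
   import mxnet as mx
   
   # NDArray.reshape accepts the new shape either as separate ints or as a tuple.
   a = mx.nd.ones((2,))
   print(a.reshape(-1, 1).shape)    # (2, 1)
   print(a.reshape((-1, 1)).shape)  # (2, 1)
   ```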
   
   Regards
