Posted to commits@mxnet.apache.org by gi...@git.apache.org on 2017/08/02 07:00:52 UTC

[GitHub] reminisce commented on issue #7082: Sparse Tensor: request for reviews

reminisce commented on issue #7082: Sparse Tensor: request for reviews
URL: https://github.com/apache/incubator-mxnet/pull/7082#issuecomment-319585957
 
 
   @jermainewang Thanks for the paper reference. I can see the mapping between our currently defined enums `kDefaultStorage`, `kRowSparseStorage`, and `kCSRStorage` and the vectors `[<0, dense>, <1, dense>]`, `[<0, sparse>, <1, dense>]`, and `[<0, dense>, <1, sparse>]`.
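   To make the correspondence easier to scan, here is a rough sketch of how I read that mapping (the names below are hypothetical, not code from this PR):
   
   ```python
   # Rough sketch (hypothetical names): per-dimension density annotation
   # for the three existing storage types of a 2-D NDArray.
   DENSE, SPARSE = "dense", "sparse"
   
   storage_to_dim_annotation = {
       "kDefaultStorage":   [(0, DENSE),  (1, DENSE)],   # fully dense
       "kRowSparseStorage": [(0, SPARSE), (1, DENSE)],   # only some rows stored, each stored row dense
       "kCSRStorage":       [(0, DENSE),  (1, SPARSE)],  # every row present, columns within a row sparse
   }
   ```
   
   I have a few questions regarding the proposal, though: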
   
   1. `kRowSparseStorage` maps to `[<0, sparse>, <1, dense>]` only when the ndarray is 2-D, but the enum `kRowSparseStorage` actually means **"sparse in the first dim and dense in all the remaining dims"** (see the sketch after this list). It seems difficult to use a fixed-size vector to represent this generalized case for N-D tensors, where N is only known at runtime.
   
   2. Has this formalization been shown to work for every storage format? My concern is: if we later want to add a storage format that cannot be expressed in the vector representation, how do we handle that exception?
   
   3. It is not clear to me how the [COO](https://www.tensorflow.org/api_docs/python/tf/SparseTensor) format (sketched after this list) fits into the scheme. It seems that this kind of storage format is not mentioned or considered in the paper.
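   To illustrate question 1, here is a minimal sketch (plain NumPy, not the actual row-sparse layout in this PR) of a row-sparse 3-D tensor, where only the first dimension is sparse and every stored slice is dense:
   
   ```python
   import numpy as np
   
   # Hypothetical illustration of "sparse in the first dim, dense in the rest"
   # for a tensor with logical shape (6, 4, 5).
   logical_shape = (6, 4, 5)
   row_indices = np.array([1, 4])                            # rows that are actually stored
   data = np.zeros((len(row_indices),) + logical_shape[1:])  # one dense (4, 5) slice per stored row
   
   # A 2-D annotation such as [<0, sparse>, <1, dense>] would have to grow to
   # [<0, sparse>, <1, dense>, <2, dense>] here, and N is only known at runtime.
   ```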
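   For reference on question 3, a minimal sketch of what a COO representation stores (mirroring the `indices`/`values`/`dense_shape` fields of `tf.SparseTensor`):
   
   ```python
   import numpy as np
   
   # Hypothetical COO example for a 3x4 matrix: each non-zero element is addressed
   # by its full coordinate, so no dimension is stored densely.
   dense_shape = (3, 4)
   indices = np.array([[0, 1],
                       [2, 3]])       # coordinates of the non-zero elements
   values = np.array([7.0, 9.0])      # one value per coordinate
   ```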
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services