Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2020/12/17 19:45:25 UTC

[GitHub] [tvm] codeislife99 opened a new pull request #7125: Sparse reshape op

codeislife99 opened a new pull request #7125:
URL: https://github.com/apache/tvm/pull/7125


   Thanks for contributing to TVM!   Please refer to guideline https://tvm.apache.org/docs/contribute/ for useful information and tips. After the pull request is submitted, please request code reviews from [Reviewers](https://github.com/apache/incubator-tvm/blob/master/CONTRIBUTORS.md#reviewers) by @ them in the pull request thread.
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [tvm] codeislife99 commented on pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 commented on pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#issuecomment-782473949


   I have a completely new implementation in #7477 which addresses the issues raised in this PR regarding performance and dynamic shapes. I am closing this PR; please review the new one [here](#7477).





[GitHub] [tvm] tkonolige commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
tkonolige commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r547393612



##########
File path: python/tvm/relay/op/transform.py
##########
@@ -1320,3 +1320,52 @@ def adv_index(inputs):
         Output tensor.
     """
     return _make.adv_index(Tuple(inputs))
+
+
+def sparsereshape(sparse_indices, sparse_values, prev_shape, new_shape):
+    """
+    Reshape a Sparse Tensor

Review comment:
       The convention is the same as `sparse_to_dense`. However `sparse_dense` uses CSR and BSR formats. We should probably add documentation to `sparse_to_dense` too.
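For context on the two formats this comment contrasts, here is a minimal NumPy sketch (an illustration, not TVM code): COO stores one coordinate row per value, as `sparse_to_dense` expects, while CSR compresses the row coordinates into a pointer array, as `sparse_dense` uses.

```python
import numpy as np

# COO: parallel arrays of row/column coordinates and values.
coo_rows = np.array([0, 0, 1, 2])
coo_cols = np.array([0, 2, 1, 2])
values   = np.array([7, 5, 6, 3])
num_rows = 3

# CSR: values stay in row-major order; the row coordinates are replaced by a
# pointer array where indptr[r]:indptr[r+1] spans row r's entries.
counts = np.bincount(coo_rows, minlength=num_rows)
indptr = np.concatenate(([0], np.cumsum(counts)))
print(indptr.tolist())    # [0, 2, 3, 4]
print(coo_cols.tolist())  # column indices per value, unchanged: [0, 2, 1, 2]
```

This assumes the COO entries are already sorted by row, which is also what a CSR conversion in general would have to guarantee.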







[GitHub] [tvm] mbrookhart commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
mbrookhart commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r551422829



##########
File path: include/tvm/topi/transform.h
##########
@@ -1386,6 +1386,88 @@ inline Array<Tensor> meshgrid(const Array<Tensor>& inputs, const std::string& in
   return result;
 }
 
+/*!
+ * \brief Compute new sparse indices and return them after the sparse_reshape operation
+ *
+ * \param sparse_indices Indices where values of the dense tensor exist
+ * \param prev_shape Old Shape of the sparse tensor corresponding to sparse_indices
+ * \param new_shape Desired Shape of the sparse tensor which will correspond to output
+ * \param name The name of the operation
+ * \param tag The tag to mark the operation
+ *
+ * \return A Tensor whose op member is the sparse_reshape operation
+ */
+inline Array<Tensor> SparseReshape(const Tensor& sparse_indices, const Tensor& prev_shape,
+                                   const Tensor& new_shape,
+                                   const std::string name = "T_sparse_reshape",
+                                   std::string tag = kInjective) {
+  Array<Tensor> result;
+  Array<PrimExpr> new_sparse_indices_shape{sparse_indices->shape[0], new_shape->shape[0]};
+
+  int new_shape_size = GetConstInt(new_shape->shape[0]);
+  int prev_shape_size = GetConstInt(prev_shape->shape[0]);

Review comment:
       My main complaint is that this will fail with dynamic input shapes. From what I understand, you expect multiple chained dynamically-shaped sparse ops in the model you're trying to target, so I'm hesitant to merge this because I'm under the impression that this will not solve the larger problem you're trying to solve. 
   
   I'd really like to see you either test the model in a branch containing all three of your PRs, or write a unit test with a representative subgraph.
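For reference, the index arithmetic such a unit test could check against can be sketched as a NumPy reference implementation (hypothetical, not the PR's TOPI code): linearize each COO coordinate under the old shape, then de-linearize it under the new shape; the values are untouched.

```python
import numpy as np

def sparse_reshape_ref(sparse_indices, prev_shape, new_shape):
    """Reference COO sparse reshape: flatten each index under prev_shape,
    then unflatten under new_shape. Values are unchanged, so only the
    indices (and the resolved shape) are returned."""
    new_shape = np.array(new_shape)
    # Resolve a single -1 the way a dense reshape does.
    if (new_shape == -1).any():
        known = np.prod(new_shape[new_shape != -1])
        new_shape[new_shape == -1] = np.prod(prev_shape) // known
    flat = np.ravel_multi_index(sparse_indices.T, prev_shape)
    return np.stack(np.unravel_index(flat, new_shape), axis=1), new_shape

indices = np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0], [1, 0, 0], [1, 2, 3]])
new_indices, shape = sparse_reshape_ref(indices, (2, 3, 4), (6, -1))
print(shape.tolist())  # [6, 4]
# new_indices rows: [0,0], [0,1], [1,0], [3,0], [5,3]
```

Note this reference only handles static shapes; the dynamic-shape concern above is exactly that the flattened sizes and the `-1` resolution must be computed at runtime.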







[GitHub] [tvm] codeislife99 commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r546652775



##########
File path: python/tvm/relay/op/transform.py
##########
@@ -1320,3 +1320,47 @@ def adv_index(inputs):
         Output tensor.
     """
     return _make.adv_index(Tuple(inputs))
+
+
+def sparsereshape(sparse_indices, sparse_values, prev_shape, new_shape):
+    """
+    Reshape a Sparse Tensor
+
+    Parameters
+    ----------
+    inputs : List[relay.Expr]
+        Input tensor and indices.
+        The first tensor is input data and rests are indices.
+
+    Returns
+    -------
+    result: relay.Expr
+        Output tensor.
+    Examples
+        --------
+        .. code-block:: python
+
+            sparse_indices = [[0, 0, 0],
+                              [0, 0, 1],
+                              [0, 1, 0],
+                              [1, 0, 0],
+                              [1, 2, 3]]
+
+            sparse_values = [7, 5, 6, 3, 9]
+
+            prev_shape = [2, 3, 4]
+
+            new_shape = [9, -1]
+
+            relay.sparsereshape(sparse_indices,
+                                sparse_values,
+                                prev_shape,
+                                new_shape)
+             =   [[0, 0],
+                  [0, 1],
+                  [1, 2],
+                  [4, 2],
+                  [8, 1]]
+

Review comment:
       Done.

##########
File path: python/tvm/topi/transform.py
##########
@@ -931,3 +931,47 @@ def adv_index(data, indices):
         Output tensor
     """
     return cpp.adv_index(data, indices)
+
+
+def sparsereshape(sparse_indices, sparse_values, prev_shape, new_shape):
+    """
+    Reshape a Sparse Tensor
+
+    Parameters
+    ----------
+    inputs : List[relay.Expr]
+        Input tensor and indices.
+        The first tensor is input data and rests are indices.
+
+    Returns
+    -------
+    result: relay.Expr
+        Output tensor.
+    Examples
+        --------
+        .. code-block:: python
+
+            sparse_indices = [[0, 0, 0],
+                              [0, 0, 1],
+                              [0, 1, 0],
+                              [1, 0, 0],
+                              [1, 2, 3]]
+
+            sparse_values = [7, 5, 6, 3, 9]
+
+            prev_shape = [2, 3, 4]
+
+            new_shape = [9, -1]
+
+            relay.sparsereshape(sparse_indices,
+                                sparse_values,
+                                prev_shape,
+                                new_shape)
+             =   [[0, 0],
+                  [0, 1],
+                  [1, 2],
+                  [4, 2],
+                  [8, 1]]
+

Review comment:
       Done.







[GitHub] [tvm] codeislife99 commented on pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 commented on pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#issuecomment-752427169


   That's understandable, of course; I can wait for the full review. In the meantime I will create a branch with all three ops and E2E testing.





[GitHub] [tvm] codeislife99 commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r546677613



##########
File path: include/tvm/topi/transform.h
##########
@@ -1386,6 +1386,85 @@ inline Array<Tensor> meshgrid(const Array<Tensor>& inputs, const std::string& in
   return result;
 }
 
+/*!
+ * \brief Compute new sparse indices and return them after the sparsereshape operation
+ *
+ * \param sparse_indices Indices where values of the dense tensor exist
+ * \param sparse_values Values at the above indices respectively
+ * \param prev_shape Old Shape of the sparse tensor corresponding to sparse_indices
+ * \param new_shape Desired Shape of the sparse tensor which will correspond to output
+ * \param name The name of the operation
+ * \param tag The tag to mark the operation
+ *
+ * \return A Tensor whose op member is the sparsereshape operation
+ */
+
+inline Array<Tensor> SparseReshape(const Tensor& sparse_indices, const Tensor& sparse_values,
+                                   const Tensor& prev_shape, const Tensor& new_shape,
+                                   const std::string name = "T_sparsereshape",
+                                   std::string tag = kInjective) {
+  Array<Tensor> result;
+  Array<PrimExpr> new_sparse_indices_shape{sparse_indices->shape[0], new_shape->shape[0]};
+  std::vector<PrimExpr> multipliers(GetConstInt(prev_shape->shape[0]), 1);
+  std::vector<PrimExpr> dividers(GetConstInt(new_shape->shape[0]), 1);
+
+  tvm::te::compute(Array<PrimExpr>{1}, [&](const Array<Var>& indices) {

Review comment:
       Cleaned this up as per offline discussion.







[GitHub] [tvm] codeislife99 commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r546652731



##########
File path: python/tvm/topi/transform.py
##########
@@ -931,3 +931,47 @@ def adv_index(data, indices):
         Output tensor
     """
     return cpp.adv_index(data, indices)
+
+
+def sparsereshape(sparse_indices, sparse_values, prev_shape, new_shape):
+    """
+    Reshape a Sparse Tensor
+
+    Parameters
+    ----------
+    inputs : List[relay.Expr]
+        Input tensor and indices.
+        The first tensor is input data and rests are indices.
+
+    Returns
+    -------
+    result: relay.Expr
+        Output tensor.
+    Examples
+        --------

Review comment:
       Done.







[GitHub] [tvm] tkonolige commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
tkonolige commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r546831211



##########
File path: python/tvm/relay/op/_transform.py
##########
@@ -63,6 +63,8 @@
 _reg.register_injective_schedule("sparse_to_dense")
 _reg.register_injective_schedule("matrix_set_diag")
 _reg.register_injective_schedule("adv_index")
+_reg.register_injective_schedule("sparsereshape")

Review comment:
       I think `sparse_reshape` would be a more appropriate name given the above naming conventions.

##########
File path: python/tvm/relay/op/transform.py
##########
@@ -1320,3 +1320,52 @@ def adv_index(inputs):
         Output tensor.
     """
     return _make.adv_index(Tuple(inputs))
+
+
+def sparsereshape(sparse_indices, sparse_values, prev_shape, new_shape):
+    """
+    Reshape a Sparse Tensor

Review comment:
       Could you note that this function only supports tensors in COO format, not CSR? In other parts of the codebase, we tend to use CSR.

##########
File path: src/relay/op/tensor/transform.cc
##########
@@ -1553,6 +1553,52 @@ RELAY_REGISTER_OP("meshgrid")
     .set_attr<FTVMCompute>("FTVMCompute", MeshgridCompute)
     .set_attr<TOpPattern>("TOpPattern", kInjective);
 
+TVM_REGISTER_NODE_TYPE(SparseReshapeAttrs);
+
+bool SparseReshapeRel(const Array<Type>& types, int num_inputs, const Attrs& attrs,
+                      const TypeReporter& reporter) {
+  // types: [sparse_indices, sparse_values, result]
+  // ICHECK_EQ(types.size(), 3);
+  auto sparse_indices = types[0].as<TensorTypeNode>();
+  const auto* param = attrs.as<SparseReshapeAttrs>();
+  CHECK(param != nullptr);
+  Array<PrimExpr> new_sparse_indices_shape{sparse_indices->shape[0],
+                                           static_cast<int>((param->new_shape).size())};
+  reporter->Assign(types[2], TensorType(new_sparse_indices_shape, sparse_indices->dtype));
+  return true;
+}
+
+Array<te::Tensor> SparseReshapeCompute(const Attrs& attrs, const Array<te::Tensor>& inputs,
+                                       const Type& out_type) {
+  // ICHECK_EQ(inputs.size(), 2);
+  const auto* param = attrs.as<SparseReshapeAttrs>();
+  CHECK(param != nullptr);
+  return {topi::SparseReshape(inputs[0], inputs[1], param->prev_shape, param->new_shape)};
+}
+
+Expr MakeSparseReshape(Expr sparse_indices, Expr sparse_values, Array<Integer> prev_shape,
+                       Array<Integer> new_shape) {
+  auto attrs = make_object<SparseReshapeAttrs>();
+  attrs->prev_shape = std::move(prev_shape);
+  attrs->new_shape = std::move(new_shape);
+  static const Op& op = Op::Get("sparsereshape");
+  return Call(op, {sparse_indices, sparse_values}, Attrs(attrs), {});
+}
+
+TVM_REGISTER_GLOBAL("relay.op._make.sparsereshape").set_body_typed(MakeSparseReshape);
+
+RELAY_REGISTER_OP("sparsereshape")
+    .describe(R"code(Return twice of normal addition of two tensors.

Review comment:
       Update the description

##########
File path: src/relay/op/tensor/transform.cc
##########
@@ -1553,6 +1553,52 @@ RELAY_REGISTER_OP("meshgrid")
     .set_attr<FTVMCompute>("FTVMCompute", MeshgridCompute)
     .set_attr<TOpPattern>("TOpPattern", kInjective);
 
+TVM_REGISTER_NODE_TYPE(SparseReshapeAttrs);
+
+bool SparseReshapeRel(const Array<Type>& types, int num_inputs, const Attrs& attrs,
+                      const TypeReporter& reporter) {
+  // types: [sparse_indices, sparse_values, result]
+  // ICHECK_EQ(types.size(), 3);

Review comment:
       uncomment or remove

##########
File path: src/relay/op/tensor/transform.cc
##########
@@ -1553,6 +1553,52 @@ RELAY_REGISTER_OP("meshgrid")
     .set_attr<FTVMCompute>("FTVMCompute", MeshgridCompute)
     .set_attr<TOpPattern>("TOpPattern", kInjective);
 
+TVM_REGISTER_NODE_TYPE(SparseReshapeAttrs);
+
+bool SparseReshapeRel(const Array<Type>& types, int num_inputs, const Attrs& attrs,
+                      const TypeReporter& reporter) {
+  // types: [sparse_indices, sparse_values, result]
+  // ICHECK_EQ(types.size(), 3);
+  auto sparse_indices = types[0].as<TensorTypeNode>();
+  const auto* param = attrs.as<SparseReshapeAttrs>();
+  CHECK(param != nullptr);

Review comment:
       ```suggestion
     ICHECK(param != nullptr);
   ```

##########
File path: src/relay/op/tensor/transform.cc
##########
@@ -1553,6 +1553,52 @@ RELAY_REGISTER_OP("meshgrid")
     .set_attr<FTVMCompute>("FTVMCompute", MeshgridCompute)
     .set_attr<TOpPattern>("TOpPattern", kInjective);
 
+TVM_REGISTER_NODE_TYPE(SparseReshapeAttrs);
+
+bool SparseReshapeRel(const Array<Type>& types, int num_inputs, const Attrs& attrs,
+                      const TypeReporter& reporter) {
+  // types: [sparse_indices, sparse_values, result]
+  // ICHECK_EQ(types.size(), 3);
+  auto sparse_indices = types[0].as<TensorTypeNode>();
+  const auto* param = attrs.as<SparseReshapeAttrs>();
+  CHECK(param != nullptr);
+  Array<PrimExpr> new_sparse_indices_shape{sparse_indices->shape[0],
+                                           static_cast<int>((param->new_shape).size())};
+  reporter->Assign(types[2], TensorType(new_sparse_indices_shape, sparse_indices->dtype));
+  return true;
+}
+
+Array<te::Tensor> SparseReshapeCompute(const Attrs& attrs, const Array<te::Tensor>& inputs,
+                                       const Type& out_type) {
+  // ICHECK_EQ(inputs.size(), 2);

Review comment:
       uncomment or remove. If you keep it, it would be great to have an error message too.

##########
File path: src/relay/op/tensor/transform.cc
##########
@@ -1553,6 +1553,52 @@ RELAY_REGISTER_OP("meshgrid")
     .set_attr<FTVMCompute>("FTVMCompute", MeshgridCompute)
     .set_attr<TOpPattern>("TOpPattern", kInjective);
 
+TVM_REGISTER_NODE_TYPE(SparseReshapeAttrs);
+
+bool SparseReshapeRel(const Array<Type>& types, int num_inputs, const Attrs& attrs,
+                      const TypeReporter& reporter) {
+  // types: [sparse_indices, sparse_values, result]
+  // ICHECK_EQ(types.size(), 3);
+  auto sparse_indices = types[0].as<TensorTypeNode>();
+  const auto* param = attrs.as<SparseReshapeAttrs>();
+  CHECK(param != nullptr);
+  Array<PrimExpr> new_sparse_indices_shape{sparse_indices->shape[0],
+                                           static_cast<int>((param->new_shape).size())};
+  reporter->Assign(types[2], TensorType(new_sparse_indices_shape, sparse_indices->dtype));
+  return true;
+}
+
+Array<te::Tensor> SparseReshapeCompute(const Attrs& attrs, const Array<te::Tensor>& inputs,
+                                       const Type& out_type) {
+  // ICHECK_EQ(inputs.size(), 2);
+  const auto* param = attrs.as<SparseReshapeAttrs>();
+  CHECK(param != nullptr);

Review comment:
       ```suggestion
     ICHECK(param != nullptr);
   ```







[GitHub] [tvm] codeislife99 commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r547086641



##########
File path: src/relay/op/tensor/transform.cc
##########
@@ -1553,6 +1553,52 @@ RELAY_REGISTER_OP("meshgrid")
     .set_attr<FTVMCompute>("FTVMCompute", MeshgridCompute)
     .set_attr<TOpPattern>("TOpPattern", kInjective);
 
+TVM_REGISTER_NODE_TYPE(SparseReshapeAttrs);
+
+bool SparseReshapeRel(const Array<Type>& types, int num_inputs, const Attrs& attrs,
+                      const TypeReporter& reporter) {
+  // types: [sparse_indices, sparse_values, result]
+  // ICHECK_EQ(types.size(), 3);

Review comment:
       Done.

##########
File path: src/relay/op/tensor/transform.cc
##########
@@ -1553,6 +1553,52 @@ RELAY_REGISTER_OP("meshgrid")
     .set_attr<FTVMCompute>("FTVMCompute", MeshgridCompute)
     .set_attr<TOpPattern>("TOpPattern", kInjective);
 
+TVM_REGISTER_NODE_TYPE(SparseReshapeAttrs);
+
+bool SparseReshapeRel(const Array<Type>& types, int num_inputs, const Attrs& attrs,
+                      const TypeReporter& reporter) {
+  // types: [sparse_indices, sparse_values, result]
+  // ICHECK_EQ(types.size(), 3);
+  auto sparse_indices = types[0].as<TensorTypeNode>();
+  const auto* param = attrs.as<SparseReshapeAttrs>();
+  CHECK(param != nullptr);

Review comment:
       Done.

##########
File path: src/relay/op/tensor/transform.cc
##########
@@ -1553,6 +1553,52 @@ RELAY_REGISTER_OP("meshgrid")
     .set_attr<FTVMCompute>("FTVMCompute", MeshgridCompute)
     .set_attr<TOpPattern>("TOpPattern", kInjective);
 
+TVM_REGISTER_NODE_TYPE(SparseReshapeAttrs);
+
+bool SparseReshapeRel(const Array<Type>& types, int num_inputs, const Attrs& attrs,
+                      const TypeReporter& reporter) {
+  // types: [sparse_indices, sparse_values, result]
+  // ICHECK_EQ(types.size(), 3);
+  auto sparse_indices = types[0].as<TensorTypeNode>();
+  const auto* param = attrs.as<SparseReshapeAttrs>();
+  CHECK(param != nullptr);
+  Array<PrimExpr> new_sparse_indices_shape{sparse_indices->shape[0],
+                                           static_cast<int>((param->new_shape).size())};
+  reporter->Assign(types[2], TensorType(new_sparse_indices_shape, sparse_indices->dtype));
+  return true;
+}
+
+Array<te::Tensor> SparseReshapeCompute(const Attrs& attrs, const Array<te::Tensor>& inputs,
+                                       const Type& out_type) {
+  // ICHECK_EQ(inputs.size(), 2);

Review comment:
       Done.

##########
File path: src/relay/op/tensor/transform.cc
##########
@@ -1553,6 +1553,52 @@ RELAY_REGISTER_OP("meshgrid")
     .set_attr<FTVMCompute>("FTVMCompute", MeshgridCompute)
     .set_attr<TOpPattern>("TOpPattern", kInjective);
 
+TVM_REGISTER_NODE_TYPE(SparseReshapeAttrs);
+
+bool SparseReshapeRel(const Array<Type>& types, int num_inputs, const Attrs& attrs,
+                      const TypeReporter& reporter) {
+  // types: [sparse_indices, sparse_values, result]
+  // ICHECK_EQ(types.size(), 3);
+  auto sparse_indices = types[0].as<TensorTypeNode>();
+  const auto* param = attrs.as<SparseReshapeAttrs>();
+  CHECK(param != nullptr);
+  Array<PrimExpr> new_sparse_indices_shape{sparse_indices->shape[0],
+                                           static_cast<int>((param->new_shape).size())};
+  reporter->Assign(types[2], TensorType(new_sparse_indices_shape, sparse_indices->dtype));
+  return true;
+}
+
+Array<te::Tensor> SparseReshapeCompute(const Attrs& attrs, const Array<te::Tensor>& inputs,
+                                       const Type& out_type) {
+  // ICHECK_EQ(inputs.size(), 2);
+  const auto* param = attrs.as<SparseReshapeAttrs>();
+  CHECK(param != nullptr);

Review comment:
       Done.

##########
File path: src/relay/op/tensor/transform.cc
##########
@@ -1553,6 +1553,52 @@ RELAY_REGISTER_OP("meshgrid")
     .set_attr<FTVMCompute>("FTVMCompute", MeshgridCompute)
     .set_attr<TOpPattern>("TOpPattern", kInjective);
 
+TVM_REGISTER_NODE_TYPE(SparseReshapeAttrs);
+
+bool SparseReshapeRel(const Array<Type>& types, int num_inputs, const Attrs& attrs,
+                      const TypeReporter& reporter) {
+  // types: [sparse_indices, sparse_values, result]
+  // ICHECK_EQ(types.size(), 3);
+  auto sparse_indices = types[0].as<TensorTypeNode>();
+  const auto* param = attrs.as<SparseReshapeAttrs>();
+  CHECK(param != nullptr);
+  Array<PrimExpr> new_sparse_indices_shape{sparse_indices->shape[0],
+                                           static_cast<int>((param->new_shape).size())};
+  reporter->Assign(types[2], TensorType(new_sparse_indices_shape, sparse_indices->dtype));
+  return true;
+}
+
+Array<te::Tensor> SparseReshapeCompute(const Attrs& attrs, const Array<te::Tensor>& inputs,
+                                       const Type& out_type) {
+  // ICHECK_EQ(inputs.size(), 2);
+  const auto* param = attrs.as<SparseReshapeAttrs>();
+  CHECK(param != nullptr);
+  return {topi::SparseReshape(inputs[0], inputs[1], param->prev_shape, param->new_shape)};
+}
+
+Expr MakeSparseReshape(Expr sparse_indices, Expr sparse_values, Array<Integer> prev_shape,
+                       Array<Integer> new_shape) {
+  auto attrs = make_object<SparseReshapeAttrs>();
+  attrs->prev_shape = std::move(prev_shape);
+  attrs->new_shape = std::move(new_shape);
+  static const Op& op = Op::Get("sparsereshape");
+  return Call(op, {sparse_indices, sparse_values}, Attrs(attrs), {});
+}
+
+TVM_REGISTER_GLOBAL("relay.op._make.sparsereshape").set_body_typed(MakeSparseReshape);
+
+RELAY_REGISTER_OP("sparsereshape")
+    .describe(R"code(Return twice of normal addition of two tensors.

Review comment:
       Done.







[GitHub] [tvm] mbrookhart commented on pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
mbrookhart commented on pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#issuecomment-752206801


   I'm out on leave this week, so I won't be giving it a full review until Monday. @codeislife99 , do you have a branch where you tested the TF model with all three ops?





[GitHub] [tvm] codeislife99 commented on pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 commented on pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#issuecomment-747790790


   @trevor-m @comaniac @zhiics PTAL 





[GitHub] [tvm] codeislife99 commented on pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 commented on pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#issuecomment-751961267


   @tkonolige @mbrookhart I have added the TF Frontend code. Can I get a re-review on this PR following up on our discussion last week ? 





[GitHub] [tvm] codeislife99 commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r547086590



##########
File path: python/tvm/relay/op/_transform.py
##########
@@ -63,6 +63,8 @@
 _reg.register_injective_schedule("sparse_to_dense")
 _reg.register_injective_schedule("matrix_set_diag")
 _reg.register_injective_schedule("adv_index")
+_reg.register_injective_schedule("sparsereshape")

Review comment:
       Done.







[GitHub] [tvm] codeislife99 closed pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 closed pull request #7125:
URL: https://github.com/apache/tvm/pull/7125


   





[GitHub] [tvm] codeislife99 commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r546652595



##########
File path: include/tvm/topi/transform.h
##########
@@ -1386,6 +1386,85 @@ inline Array<Tensor> meshgrid(const Array<Tensor>& inputs, const std::string& in
   return result;
 }
 
+/*!
+ * \brief Compute new sparse indices and return them after the sparsereshape operation
+ *
+ * \param sparse_indices Indices where values of the dense tensor exist
+ * \param sparse_values Values at the above indices respectively
+ * \param prev_shape Old Shape of the sparse tensor corresponding to sparse_indices
+ * \param new_shape Desired Shape of the sparse tensor which will correspond to output
+ * \param name The name of the operation
+ * \param tag The tag to mark the operation
+ *
+ * \return A Tensor whose op member is the sparsereshape operation
+ */
+

Review comment:
       Done. 

##########
File path: python/tvm/relay/op/transform.py
##########
@@ -1320,3 +1320,47 @@ def adv_index(inputs):
         Output tensor.
     """
     return _make.adv_index(Tuple(inputs))
+
+
+def sparsereshape(sparse_indices, sparse_values, prev_shape, new_shape):
+    """
+    Reshape a Sparse Tensor
+
+    Parameters
+    ----------
+    inputs : List[relay.Expr]
+        Input tensor and indices.
+        The first tensor is input data and rests are indices.
+
+    Returns
+    -------
+    result: relay.Expr
+        Output tensor.
+    Examples
+        --------

Review comment:
       Done. 







[GitHub] [tvm] trevor-m commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
trevor-m commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r545520606



##########
File path: include/tvm/topi/transform.h
##########
@@ -1386,6 +1386,85 @@ inline Array<Tensor> meshgrid(const Array<Tensor>& inputs, const std::string& in
   return result;
 }
 
+/*!
+ * \brief Compute new sparse indices and return them after the sparsereshape operation
+ *
+ * \param sparse_indices Indices where values of the dense tensor exist
+ * \param sparse_values Values at the above indices respectively
+ * \param prev_shape Old Shape of the sparse tensor corresponding to sparse_indices
+ * \param new_shape Desired Shape of the sparse tensor which will correspond to output
+ * \param name The name of the operation
+ * \param tag The tag to mark the operation
+ *
+ * \return A Tensor whose op member is the sparsereshape operation
+ */
+
+inline Array<Tensor> SparseReshape(const Tensor& sparse_indices, const Tensor& sparse_values,
+                                   const Tensor& prev_shape, const Tensor& new_shape,
+                                   const std::string name = "T_sparsereshape",
+                                   std::string tag = kInjective) {
+  Array<Tensor> result;
+  Array<PrimExpr> new_sparse_indices_shape{sparse_indices->shape[0], new_shape->shape[0]};
+  std::vector<PrimExpr> multipliers(GetConstInt(prev_shape->shape[0]), 1);
+  std::vector<PrimExpr> dividers(GetConstInt(new_shape->shape[0]), 1);
+
+  tvm::te::compute(Array<PrimExpr>{1}, [&](const Array<Var>& indices) {

Review comment:
       This compute looks weird to me since it isn't returning anything. I think it's better to determine the multipliers shape and do a compute over that shape to initialize multipliers, rather than a compute over [1] that has internal loops. Same for dividers.
   
   Can someone with more experience here chime in?
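
   For reference in this discussion: the multipliers/dividers in the hunk above are just row-major strides — each sparse index is flattened with the old shape's strides and then unflattened with the new shape's. A minimal NumPy sketch of that index remapping (a sketch only, not the te-based implementation under review; the function name and the `-1` handling are illustrative assumptions):

   ```python
   import numpy as np

   # Hypothetical reference implementation of the sparse-reshape index math.
   def sparse_reshape(sparse_indices, prev_shape, new_shape):
       prev_shape = np.asarray(prev_shape)
       new_shape = np.asarray(new_shape).copy()
       # Resolve a single -1 in new_shape from the total element count.
       if (new_shape == -1).any():
           known = np.prod(new_shape[new_shape != -1])
           new_shape[new_shape == -1] = np.prod(prev_shape) // known
       # Row-major strides: "multipliers" for the old shape, "dividers" for the new.
       prev_strides = np.append(np.cumprod(prev_shape[::-1])[::-1][1:], 1)
       new_strides = np.append(np.cumprod(new_shape[::-1])[::-1][1:], 1)
       # Flatten each sparse index, then unflatten it into the new shape.
       flat = np.asarray(sparse_indices) @ prev_strides
       return np.stack([(flat // s) % d for s, d in zip(new_strides, new_shape)], axis=1)

   indices = [[0, 0, 0], [0, 0, 1], [0, 1, 0], [1, 0, 0], [1, 2, 3]]
   sparse_reshape(indices, [2, 3, 6], [9, -1])
   # -> [[0, 0], [0, 1], [1, 2], [4, 2], [8, 1]]
   ```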







[GitHub] [tvm] anijain2305 commented on pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
anijain2305 commented on pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#issuecomment-752184180


   Style and organization-wise LGTM. @tkonolige @mbrookhart Can you PTAL?





[GitHub] [tvm] comaniac commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
comaniac commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r545511543



##########
File path: include/tvm/topi/transform.h
##########
@@ -1386,6 +1386,85 @@ inline Array<Tensor> meshgrid(const Array<Tensor>& inputs, const std::string& in
   return result;
 }
 
+/*!
+ * \brief Compute new sparse indices and return them after the sparsereshape operation
+ *
+ * \param sparse_indices Indices where values of the dense tensor exist
+ * \param sparse_values Values at the above indices respectively
+ * \param prev_shape Old Shape of the sparse tensor corresponding to sparse_indices
+ * \param new_shape Desired Shape of the sparse tensor which will correspond to output
+ * \param name The name of the operation
+ * \param tag The tag to mark the operation
+ *
+ * \return A Tensor whose op member is the sparsereshape operation
+ */
+

Review comment:
       remove this line.

##########
File path: python/tvm/relay/op/transform.py
##########
@@ -1320,3 +1320,47 @@ def adv_index(inputs):
         Output tensor.
     """
     return _make.adv_index(Tuple(inputs))
+
+
+def sparsereshape(sparse_indices, sparse_values, prev_shape, new_shape):
+    """
+    Reshape a Sparse Tensor
+
+    Parameters
+    ----------
+    inputs : List[relay.Expr]
+        Input tensor and indices.
+        The first tensor is input data and rests are indices.
+
+    Returns
+    -------
+    result: relay.Expr
+        Output tensor.
+    Examples
+        --------
+        .. code-block:: python
+
+            sparse_indices = [[0, 0, 0],
+                              [0, 0, 1],
+                              [0, 1, 0],
+                              [1, 0, 0],
+                              [1, 2, 3]]
+
+            sparse_values = [7, 5, 6, 3, 9]
+
+            prev_shape = [2, 3, 6]
+
+            new_shape = [9, -1]
+
+            relay.sparsereshape(sparse_indices,
+                                sparse_values,
+                                prev_shape,
+                                new_shape)
+             =   [[0, 0],
+                  [0, 1],
+                  [1, 2],
+                  [4, 2],
+                  [8, 1]]
+

Review comment:
       remove this line.

##########
File path: python/tvm/topi/transform.py
##########
@@ -931,3 +931,47 @@ def adv_index(data, indices):
         Output tensor
     """
     return cpp.adv_index(data, indices)
+
+
+def sparsereshape(sparse_indices, sparse_values, prev_shape, new_shape):
+    """
+    Reshape a Sparse Tensor
+
+    Parameters
+    ----------
+    inputs : List[relay.Expr]
+        Input tensor and indices.
+        The first tensor is input data and rests are indices.
+
+    Returns
+    -------
+    result: relay.Expr
+        Output tensor.
+    Examples
+        --------
+        .. code-block:: python
+
+            sparse_indices = [[0, 0, 0],
+                              [0, 0, 1],
+                              [0, 1, 0],
+                              [1, 0, 0],
+                              [1, 2, 3]]
+
+            sparse_values = [7, 5, 6, 3, 9]
+
+            prev_shape = [2, 3, 6]
+
+            new_shape = [9, -1]
+
+            relay.sparsereshape(sparse_indices,
+                                sparse_values,
+                                prev_shape,
+                                new_shape)
+             =   [[0, 0],
+                  [0, 1],
+                  [1, 2],
+                  [4, 2],
+                  [8, 1]]
+

Review comment:
       remove this line.

##########
File path: python/tvm/topi/transform.py
##########
@@ -931,3 +931,47 @@ def adv_index(data, indices):
         Output tensor
     """
     return cpp.adv_index(data, indices)
+
+
+def sparsereshape(sparse_indices, sparse_values, prev_shape, new_shape):
+    """
+    Reshape a Sparse Tensor
+
+    Parameters
+    ----------
+    inputs : List[relay.Expr]
+        Input tensor and indices.
+        The first tensor is input data and rests are indices.
+
+    Returns
+    -------
+    result: relay.Expr
+        Output tensor.
+    Examples
+        --------

Review comment:
       ```suggestion
   
       Examples
       --------
   ```

##########
File path: python/tvm/relay/op/transform.py
##########
@@ -1320,3 +1320,47 @@ def adv_index(inputs):
         Output tensor.
     """
     return _make.adv_index(Tuple(inputs))
+
+
+def sparsereshape(sparse_indices, sparse_values, prev_shape, new_shape):
+    """
+    Reshape a Sparse Tensor
+
+    Parameters
+    ----------
+    inputs : List[relay.Expr]
+        Input tensor and indices.
+        The first tensor is input data and rests are indices.
+
+    Returns
+    -------
+    result: relay.Expr
+        Output tensor.
+    Examples
+        --------

Review comment:
       ```suggestion
   
       Examples
       --------
   ```







[GitHub] [tvm] codeislife99 commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r545533821



##########
File path: include/tvm/topi/transform.h
##########
@@ -1386,6 +1386,85 @@ inline Array<Tensor> meshgrid(const Array<Tensor>& inputs, const std::string& in
   return result;
 }
 
+/*!
+ * \brief Compute new sparse indices and return them after the sparsereshape operation
+ *
+ * \param sparse_indices Indices where values of the dense tensor exist
+ * \param sparse_values Values at the above indices respectively
+ * \param prev_shape Old Shape of the sparse tensor corresponding to sparse_indices
+ * \param new_shape Desired Shape of the sparse tensor which will correspond to output
+ * \param name The name of the operation
+ * \param tag The tag to mark the operation
+ *
+ * \return A Tensor whose op member is the sparsereshape operation
+ */
+
+inline Array<Tensor> SparseReshape(const Tensor& sparse_indices, const Tensor& sparse_values,
+                                   const Tensor& prev_shape, const Tensor& new_shape,
+                                   const std::string name = "T_sparsereshape",
+                                   std::string tag = kInjective) {
+  Array<Tensor> result;
+  Array<PrimExpr> new_sparse_indices_shape{sparse_indices->shape[0], new_shape->shape[0]};
+  std::vector<PrimExpr> multipliers(GetConstInt(prev_shape->shape[0]), 1);
+  std::vector<PrimExpr> dividers(GetConstInt(new_shape->shape[0]), 1);
+
+  tvm::te::compute(Array<PrimExpr>{1}, [&](const Array<Var>& indices) {

Review comment:
       Yeah, I was thinking of that too, but since the value of multipliers depends on its later indices, it might be a bit non-trivial to compute over multipliers directly. Am I missing something? I am open to pseudo-code ideas to make it cleaner.
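
   The backward dependence described here (multipliers[i] depends on multipliers[i+1]) is just a suffix product over the shape; a plain-Python sketch of what that loop computes (illustrative only, not the te code):

   ```python
   # multipliers[i] is the product of all dimensions after i (row-major strides),
   # so it has to be filled from the last axis backwards.
   def suffix_products(shape):
       mult = [1] * len(shape)
       for i in range(len(shape) - 2, -1, -1):
           mult[i] = mult[i + 1] * shape[i + 1]
       return mult

   suffix_products([2, 3, 6])  # -> [18, 6, 1]
   ```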







[GitHub] [tvm] codeislife99 commented on a change in pull request #7125: Sparse reshape op

Posted by GitBox <gi...@apache.org>.
codeislife99 commented on a change in pull request #7125:
URL: https://github.com/apache/tvm/pull/7125#discussion_r547087287



##########
File path: python/tvm/relay/op/transform.py
##########
@@ -1320,3 +1320,52 @@ def adv_index(inputs):
         Output tensor.
     """
     return _make.adv_index(Tuple(inputs))
+
+
+def sparsereshape(sparse_indices, sparse_values, prev_shape, new_shape):
+    """
+    Reshape a Sparse Tensor

Review comment:
       Can you explain how this convention is different from the `sparse_to_dense` operator? I could only find that operator as an example of existing representations.



