Posted to discuss-archive@tvm.apache.org by Gopinath R via TVM Discuss <no...@discuss.tvm.ai> on 2020/04/22 12:14:06 UTC

[TVM Discuss] [Questions] Don't know how to handle type while using torch.repeat


I am trying to build a TVM Relay module for the following model: [BTS](https://github.com/cogaplex-bts/bts/blob/master/pytorch/bts.py).
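For context, here is a minimal sketch of the conversion pipeline I am running (the stand-in model, input name, and input shape below are placeholders for my actual setup, and the exact `input_shapes` format depends on the TVM version):

```python
import torch
from tvm import relay

# Placeholder network standing in for the BTS model from the repo above;
# substitute the real model and load its checkpoint here.
model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3, padding=1)).eval()

dummy_input = torch.randn(1, 3, 352, 1216)           # placeholder input shape
script_module = torch.jit.trace(model, dummy_input)

# Hand the traced TorchScript module to the TVM PyTorch frontend.
input_shapes = [("input0", (1, 3, 352, 1216))]
mod, params = relay.frontend.from_pytorch(script_module, input_shapes)
```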

While converting, I am getting the following error:
```
/Documents/Projects/Kyocera_depth_estimation/optimization/tvm/python/tvm/relay/frontend/pytorch.py in from_pytorch(script_module, input_shapes, custom_convert_map)
   2254 
   2255     ret = convert_operators(_get_operator_nodes(graph.nodes()),
-> 2256                             outputs, ret_name, convert_map, prelude)
   2257 
   2258     mod["main"] = tvm.relay.Function(_analysis.free_vars(ret[0]), ret[0])

~/Documents/Projects/Kyocera_depth_estimation/optimization/tvm/python/tvm/relay/frontend/pytorch.py in convert_operators(operators, outputs, ret_names, convert_map, prelude)
   2168         else:
   2169             relay_op = convert_map[operator]
-> 2170             relay_out = relay_op(inputs, _get_input_types(op_node))
   2171 
   2172             if isinstance(relay_out, tuple):

~/Documents/Projects/Kyocera_depth_estimation/optimization/tvm/python/tvm/relay/frontend/pytorch.py in _impl(inputs, input_types)
    324         print("data>>>",data)
    325         print("reps>>>",reps)
--> 326         return _op.transform.tile(data, reps=reps)
    327     return _impl
    328 

~/Documents/Projects/Kyocera_depth_estimation/optimization/tvm/python/tvm/relay/op/transform.py in tile(data, reps)
    446     """
    447 
--> 448     return _make.tile(data, reps)
    449 
    450 

~/Documents/Projects/Kyocera_depth_estimation/optimization/tvm/python/tvm/_ffi/_ctypes/packed_func.py in __call__(self, *args)
    208         """
    209         temp_args = []
--> 210         values, tcodes, num_args = _make_tvm_args(args, temp_args)
    211         ret_val = TVMValue()
    212         ret_tcode = ctypes.c_int()

~/Documents/Projects/Kyocera_depth_estimation/optimization/tvm/python/tvm/_ffi/_ctypes/packed_func.py in _make_tvm_args(args, temp_args)
    174             temp_args.append(arg)
    175         else:
--> 176             raise TypeError("Don't know how to handle type %s" % type(arg))
    177     return values, type_codes, num_args
    178 

TypeError: Don't know how to handle type <class 'torch.Tensor'>

```

I have already raised an issue about this:

[Support for torch.repeat](https://github.com/apache/incubator-tvm/issues/5133#issuecomment-617591097)

As suggested there, *currently the TVM PyTorch frontend does not support taking each value of `reps` as another tensor or expression; it expects a simple constant, and in this case we are getting a multiplication operator*. So I tried changing the lines in the `local_planar_guidance` method in `bts.py` from

```
    u = self.u.repeat(plane_eq.size(0), plane_eq.size(2) * int(self.upratio), plane_eq.size(3))
    v = self.v.repeat(plane_eq.size(0), plane_eq.size(2), plane_eq.size(3) * int(self.upratio))
```

to

```
    u = self.u.repeat(plane_eq.size(0), int(plane_eq.size(2) * int(self.upratio)), plane_eq.size(3))
    v = self.v.repeat(plane_eq.size(0), plane_eq.size(2), int(plane_eq.size(3) * int(self.upratio)))
```

but the error remains the same, and additionally I am getting the following warning:
```
/home/gopinathr/Documents/Projects/Kyocera_depth_estimation/optimization/bts_tvm/bts_inference_code/bts.py:151: TracerWarning: Converting a tensor to a Python integer might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  u = self.u.repeat(plane_eq.size(0), int(plane_eq.size(2) * int(self.upratio)), plane_eq.size(3))
/home/gopinathr/Documents/Projects/Kyocera_depth_estimation/optimization/bts_tvm/bts_inference_code/bts.py:156: TracerWarning: Converting a tensor to a Python integer might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  v = self.v.repeat(plane_eq.size(0), plane_eq.size(2), int(plane_eq.size(3) * int(self.upratio))) 

```

I found that this is a fundamental limitation of tracing: arbitrary Python values, including numbers, cannot be traced, so they are treated as constants in the traced function. The typical workaround is to wrap `repeat` in a scripted function.
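A toy example of this limitation, independent of BTS (shapes made up for illustration):

```python
import torch

def f(x):
    n = int(x.size(0))    # int() freezes this value at trace time
    return x.repeat(n, 1)

# Tracing with a 2-row input bakes the repeat count 2 into the graph
# (PyTorch emits the same TracerWarning as above).
traced = torch.jit.trace(f, torch.zeros(2, 3))

# A 4-row input is still repeated only twice: prints torch.Size([8, 3]),
# not torch.Size([16, 3]).
print(traced(torch.zeros(4, 3)).shape)
```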

So I added scripted wrappers for `repeat` as follows:
```
@torch.jit.script
def repeat_u(x, plane_eq, upratio: int):
    return x.repeat(plane_eq.size(0), int(plane_eq.size(2)) * upratio, plane_eq.size(3))

@torch.jit.script
def repeat_v(y, plane_eq, upratio: int):
    return y.repeat(plane_eq.size(0), plane_eq.size(2), int(plane_eq.size(3)) * upratio)
```
and changed the lines in the `local_planar_guidance` module from
```
    u = self.u.repeat(plane_eq.size(0), plane_eq.size(2) * int(self.upratio), plane_eq.size(3))
    v = self.v.repeat(plane_eq.size(0), plane_eq.size(2), plane_eq.size(3) * int(self.upratio))
```
to

```
    u = repeat_u(self.u, plane_eq, int(self.upratio))
    v = repeat_v(self.v, plane_eq, int(self.upratio))
```
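As a quick sanity check that scripting keeps the sizes symbolic, printing the scripted function's graph shows `aten::size` nodes feeding `aten::repeat`, rather than baked-in constants:

```python
# repeat_u is the scripted wrapper defined above.
print(repeat_u.graph)
```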

This resolved the tracing warnings, but I am still getting **TypeError: Don't know how to handle type <class 'torch.Tensor'>**.

Before using the torch.jit.script wrappers, the values of `inputs` and `input_types` for `repeat` are:

```
input>>> [tensor([[[0., 1., 2., 3., 4., 5., 6., 7.]]]), Var(1739, ty=TensorType([1, 1216, 44], float32))]
input_types>>> ['float', 'ListType']
data>>> tensor([[[0., 1., 2., 3., 4., 5., 6., 7.]]])
reps>>> (1, 1216, 44)
```

The values of `inputs` and `input_types` for `repeat` when I use the torch.jit.script wrappers are:

```
input>>> [tensor([[[0., 1., 2., 3., 4., 5., 6., 7.]]]), [1, CallNode(Op(multiply), [Constant(152), Constant(8)], (nullptr), []), 44]]
input_types>>> ['float', 'ListType']
data>>> tensor([[[0., 1., 2., 3., 4., 5., 6., 7.]]])
reps>>> [1, CallNode(Op(multiply), [Constant(152), Constant(8)], (nullptr), []), 44]
```
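In both dumps, `data` arrives as a raw `torch.Tensor` (the registered buffer `self.u`), and that is the type the FFI layer rejects; in the scripted case, `reps` additionally contains a Relay `CallNode` instead of a plain int. A hypothetical sketch of the coercion the frontend would need for `data` before calling `tile` (my guess at the direction of a fix, not the actual frontend code):

```python
import torch
from tvm import relay

# A raw torch.Tensor such as self.u cannot cross the TVM FFI boundary;
# it first has to be wrapped as a Relay constant.
data = torch.tensor([[[0., 1., 2., 3., 4., 5., 6., 7.]]])
if isinstance(data, torch.Tensor):
    data = relay.const(data.numpy(), dtype="float32")

# With data as a Relay expression, tile no longer trips the FFI type check.
out = relay.tile(data, reps=(1, 1216, 44))
print(out)
```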

If anyone knows the reason and how to resolve this, it would be really helpful.





---

Posted by Siju via TVM Discuss <no...@discuss.tvm.ai>.

Along with the above-mentioned changes, apply the patch from PR https://github.com/apache/incubator-tvm/pull/5383 as well.

This PR is not merged yet; once it is merged, you can pull the latest code and it will work.
I am able to make the Relay build.

Note: there is no need to use the wrappers, and the warnings have no impact since it is scalar multiplication. With the wrappers, the same issue may happen again.





---

Posted by Gopinath R via TVM Discuss <no...@discuss.tvm.ai>.

@siju-samuel, thank you so much for the help. I am able to build the TVM Relay IR now, but I found that the outputs are wrong and I am trying to find out why. Thanks again for the help.




