Posted to discuss-archive@tvm.apache.org by Thomas V via TVM Discuss <no...@discuss.tvm.ai> on 2020/06/17 23:51:26 UTC

[TVM Discuss] [Questions] Same shape pattern


Now I'm trying to produce a pattern that matches nodes if they have the same shape.
Is such a pattern available? I only saw `has_shape`, which seems to compare against a fixed shape (which I don't know in advance).
I'm trying to use `rewrite`, so checking after the match (and returning an unchanged expression) seems like it will lead to an infinite loop.
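
For concreteness, this is the kind of check I mean in the callback - a hypothetical helper (not an existing TVM API), which just compares the dimensions as plain ints:

```python
def same_shape(a_shape, b_shape):
    # Compare two shape sequences element-wise; int() normalizes
    # TVM IntImm dims and plain Python ints to a common type.
    a = [int(dim) for dim in a_shape]
    b = [int(dim) for dim in b_shape]
    return a == b
```

The intent would be to call something like `same_shape(x._checked_type_.shape, y._checked_type_.shape)` inside the callback and only rewrite when it holds.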

Best regards

Thomas





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/1) to respond.

You are receiving this because you enabled mailing list mode.

To unsubscribe from these emails, [click here](https://discuss.tvm.ai/email/unsubscribe/be1ab08cf3b1ee0d3e27cf11efe4811ad308547668d1cfaf882dfdd05666356a).

[TVM Discuss] [Questions] Same shape pattern

Posted by Thomas V via TVM Discuss <no...@discuss.tvm.ai>.

The ZeroZapp code snippet above also has this problem.





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/10) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by Matthew Brookhart via TVM Discuss <no...@discuss.tvm.ai>.

@matt-arm Can you give me an example? Is this just the partition issue we talked about before, or something else?





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/9) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by Thomas V via TVM Discuss <no...@discuss.tvm.ai>.

So with the following rewrites and passes

```python
class ZeroZapp(tvm.relay.dataflow_pattern.DFPatternCallback):
    def __init__(self):
        self.zeros = tvm.relay.dataflow_pattern.is_op("zeros")(tvm.relay.dataflow_pattern.wildcard())
        self.other_tensor = tvm.relay.dataflow_pattern.wildcard()
        self.pattern = (self.zeros + self.other_tensor) | (self.other_tensor + self.zeros)

    def callback(self, pre, post, node_map):
        rt = node_map[self.pattern][0]
        ot = node_map[self.other_tensor][0]
        if (ot._checked_type_ == rt._checked_type_):
            return ot
        else:
            return tvm.relay.broadcast_to(ot, list(rt._checked_type_.shape))

class ZeroZapp(tvm.relay.dataflow_pattern.DFPatternCallback):  # refined version, supersedes the one above
    def __init__(self):
        self.zeros = tvm.relay.dataflow_pattern.is_op("zeros")(tvm.relay.dataflow_pattern.wildcard()) | tvm.relay.dataflow_pattern.is_constant()
        self.other_tensor = tvm.relay.dataflow_pattern.wildcard()
        self.pattern = (self.zeros + self.other_tensor) | (self.other_tensor + self.zeros)

    def callback(self, pre, post, node_map):
        rt = node_map[self.pattern][0]
        zeros = node_map[self.zeros][0]
        ot = node_map[self.other_tensor][0]
        if isinstance(zeros, tvm.relay.Constant):
            if not (zeros.data.asnumpy() == 0).all():
                return rt  # the constant is not all zeros, leave the addition alone
        # I don't know why I don't reliably get checked types here...
        if (((rt._checked_type_ is not None) and (ot._checked_type_ == rt._checked_type_))
                or (rt.type_args[0] == rt.type_args[1])):
            return ot
        elif rt._checked_type_ is not None:
            return tvm.relay.broadcast_to(ot, list(rt._checked_type_.shape))
        return rt

class OneZapp(tvm.relay.dataflow_pattern.DFPatternCallback):
    def __init__(self):
        self.ones = tvm.relay.dataflow_pattern.is_op("ones")(tvm.relay.dataflow_pattern.wildcard()) | tvm.relay.dataflow_pattern.is_constant()
        self.other_tensor = tvm.relay.dataflow_pattern.wildcard()
        self.pattern = (self.ones * self.other_tensor) | (self.other_tensor * self.ones)

    def callback(self, pre, post, node_map):
        rt = node_map[self.pattern][0]
        ones = node_map[self.ones][0]
        ot = node_map[self.other_tensor][0]
        if isinstance(ones, tvm.relay.Constant):
            if not (ones.data.asnumpy() == 1).all():
                return rt  # the constant is not all ones, leave the multiplication alone
        if (ot._checked_type_ == rt._checked_type_):
            return ot
        else:
            return tvm.relay.broadcast_to(ot, list(rt._checked_type_.shape))


class LikeZapp(tvm.relay.dataflow_pattern.DFPatternCallback):
    def __init__(self):
        self.translations_with_dt = {'zeros_like': tvm.relay.zeros,
                                     'ones_like': tvm.relay.ones}
        self.data_tensor = tvm.relay.dataflow_pattern.wildcard()
        self.pattern_tensor = tvm.relay.dataflow_pattern.wildcard()
        self.pattern = ((tvm.relay.dataflow_pattern.is_op("zeros_like")
                        | tvm.relay.dataflow_pattern.is_op("ones_like")
                        )(self.data_tensor)
                        ) | ((
                        tvm.relay.dataflow_pattern.is_op("collapse_sum_like")
                        | tvm.relay.dataflow_pattern.is_op("broadcast_to_like")
                       )(self.data_tensor, self.pattern_tensor))

    def callback(self, pre, post, node_map):
        data = node_map[self.data_tensor][0]
        res = node_map[self.pattern][0]
        if res.op.name in self.translations_with_dt:
            return self.translations_with_dt[res.op.name](list(res._checked_type_.shape),
                                                          res._checked_type_.dtype)
        if (data._checked_type_ == res._checked_type_):
            return data
        else:
            if res.op.name == 'broadcast_to_like':
                return tvm.relay.broadcast_to(data, list(res._checked_type_.shape))
            return res


grmod["main"] = tvm.relay.dataflow_pattern.rewrite(LikeZapp(), grmod["main"])
grmod = tvm.relay.transform.FoldConstant()(grmod)
grmod = tvm.relay.transform.InferType()(grmod)
grmod["main"] = tvm.relay.dataflow_pattern.rewrite(ZeroZapp(), grmod["main"])
grmod["main"] = tvm.relay.dataflow_pattern.rewrite(OneZapp(), grmod["main"])
```

I get what looks like a reasonable result:

![image|690x184](upload://hdIh0laFwoahHVCGdh649tdCQv8.png)

But this is just a trivial case; if you had a hint as to whether some of these patterns are readily available, I would be most grateful.

Also, I have no idea why I don't reliably get `_checked_type_` attributes in the ZeroZapp... If you have an idea...

Best regards

Thomas





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/4) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by Thomas V via TVM Discuss <no...@discuss.tvm.ai>.

Yeah, it all wants something static to operate on.
But what I'm after is the next step: eliminating all ops that aren't needed in a static setting.
This seems important for anything where the graph is created automatically - by the frontend converters as well as by differentiation.

Best regards

Thomas





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/16) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by Matthew Brookhart via TVM Discuss <no...@discuss.tvm.ai>.

@t-vi Sorry for my delay, I had a lot of meetings today. I've finally read through this enough to grok the problem. I'm not sure the Pattern Language is the right tool for this pass. 

As you said here:

[quote="t-vi, post:3, topic:7012"]
I’m always wondering whether I’m missing ready-made passes for removing some of the typical overhead of automatic differentiation (e.g. replacing `..._like` with static ops or removing broadcasting / collapse_sum etc.). If not, would these be useful to make available?
[/quote]

This looks like more of a need for removing dynamic ops. I'm actually working on a pass like that related to https://discuss.tvm.ai/t/dynamic-ops-in-relay/6909/17. The pass basically does a loop of infer_type, fold constants, replace dynamic ops with constant or static versions, repeat.

There's a POC up here: https://github.com/apache/incubator-tvm/pull/5826/files#diff-4e809b75f719ad7ca8fdad6300b3ae32

It doesn't support many use cases yet, but I can imagine plugging ones_like/zeros_like/broadcast_to_like in that pass and getting this behavior in a fairly straightforward way.
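
The loop structure could be sketched generically like this (just an illustration of the fixed-point driver, not the code in the PR; the `passes` argument stands in for InferType, FoldConstant, and the dynamic-op replacement):

```python
def run_to_fixed_point(expr, passes, max_iters=10):
    # Apply the pass pipeline repeatedly until the expression stops
    # changing (fixed point) or a safety limit is hit.
    for _ in range(max_iters):
        prev = expr
        for apply_pass in passes:
            expr = apply_pass(expr)
        if expr == prev:
            break
    return expr
```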





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/15) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by Thomas V via TVM Discuss <no...@discuss.tvm.ai>.

Thank you, yes.
So I have this graph, produced by `gradient` (plus graph normal form and removing the forward outputs) of a dense + bias_add. Obviously, the gradients would be `collapse_sum_like(ones_like(output), bias)` for the bias, and a couple of `dense( )` calls with `grad_out` or its transpose replacing the weight and the input respectively, to get the gradient of the other.

![image|442x500](upload://rM44aXCMEIMZCycmA2KrcCygIWu.png) 

The two passes I applied so far are
```python
class ZeroZapp(tvm.relay.dataflow_pattern.DFPatternCallback):
    def __init__(self):
        self.pattern_tensor = tvm.relay.dataflow_pattern.wildcard()
        self.zeros_like = tvm.relay.dataflow_pattern.is_op("zeros_like")(self.pattern_tensor)
        self.other_tensor = tvm.relay.dataflow_pattern.wildcard()
        self.pattern = self.zeros_like + self.other_tensor

    def callback(self, pre, post, node_map):
        rt = node_map[self.pattern][0]
        ot = node_map[self.other_tensor][0]
        if (ot._checked_type_ == rt._checked_type_):
            return ot
        else:
            return tvm.relay.broadcast_to(ot, list(rt._checked_type_.shape))

class CollapseSumZapp(tvm.relay.dataflow_pattern.DFPatternCallback):
    def __init__(self):
        self.data_tensor = tvm.relay.dataflow_pattern.wildcard()
        self.pattern_tensor = tvm.relay.dataflow_pattern.wildcard()
        self.pattern = tvm.relay.dataflow_pattern.is_op("collapse_sum_like")(self.data_tensor, self.pattern_tensor)

    def callback(self, pre, post, node_map):
        data = node_map[self.data_tensor][0]
        res = node_map[self.pattern][0]
        if (data._checked_type_ == res._checked_type_):
            return data
        else:
            return res


grfn = tvm.relay.dataflow_pattern.rewrite(ZeroZapp(), grmod["main"])
grfn = tvm.relay.dataflow_pattern.rewrite(CollapseSumZapp(), grfn)

```
For the `CollapseSumZapp` in particular, I would like to replace the `if` in the callback with a more refined pattern.

So from implicit broadcasting, I have many ops in the backward pass. `broadcast_to_like` could probably be treated just like `collapse_sum_like`.
Similarly, I might have a `reshape`, `broadcast_to`, ... where I have a shape annotation for the input and output, or I could take the input shape and the shape argument, but I don't know how to use these.

The infinite loop probably came from me doing something stupid (re-creating the final step of the calculation instead of returning the original one...).

I'm always wondering whether I'm missing ready-made passes for removing some of the typical overhead of automatic differentiation (e.g. replacing `..._like` with static ops or removing broadcasting / collapse_sum etc.). If not, would these be useful to make available?

Best regards

Thomas





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/3) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by "Cody H. Yu via TVM Discuss" <no...@discuss.tvm.ai>.

Could you provide example graphs before and after the pattern matching and rewriting to better illustrate your requirements?





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/2) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by Matt Barrett via TVM Discuss <no...@discuss.tvm.ai>.

The sort of case I'm thinking of is when a mutation takes place: the mutated part of the graph won't have types associated with it (at least not until type inference is run on the expression again). It's not immediately obvious to me whether that's happening in this example. But now that I've thought about it more, that's not a bug; it would just be a requirement that you manually propagate the type info in your mutator.





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/11) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by "Cody H. Yu via TVM Discuss" <no...@discuss.tvm.ai>.

I agree with @matt-arm. The `checked_type_` field is empty when a node is created, until `InferType` is run or a new function is added to the module. This means a node processed later may not see the types of its parents if the parents were replaced with new nodes without properly propagating their types. You could try adding `new_node.checked_type_ = old_node.checked_type_`.
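
Roughly like this - a hypothetical helper; whether assigning `checked_type_` on an FFI-backed node actually sticks may depend on the TVM version:

```python
def propagate_type(old_node, new_node):
    # Copy the cached type from the node being replaced onto its
    # replacement, so passes running before the next InferType
    # still see a type on the new node.
    if getattr(new_node, "checked_type_", None) is None:
        new_node.checked_type_ = old_node.checked_type_
    return new_node
```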





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/12) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by Thomas V via TVM Discuss <no...@discuss.tvm.ai>.

[quote="mbrookhart, post:13, topic:7012"]
I don’t particularly want to force users to type their problems before using the pattern language in all cases.
[/quote]

I can see why. But the shape processing gets really tedious here - with the inability to pass `.shape` back to Relay (because it is an Array rather than a list) being the final straw. :confused:
Maybe if there were some way of saying I want types...
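
For the array-vs-list part, a small helper like this seems to work around it (hypothetical name; it just converts the dims to plain ints):

```python
def to_py_shape(shape):
    # Turn a TVM Array of IntImm dims (or any iterable of int-like
    # dims) into a plain Python list of ints that ops like
    # relay.broadcast_to accept as a shape argument.
    return [int(dim) for dim in shape]
```

e.g. `tvm.relay.broadcast_to(ot, to_py_shape(rt._checked_type_.shape))` instead of spelling out `list(...)` everywhere.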





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/14) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by Matthew Brookhart via TVM Discuss <no...@discuss.tvm.ai>.

Ah, yeah, this makes sense now. 

The first LikeZapp pass will return this in certain cases:
```python
tvm.relay.broadcast_to(data, list(res._checked_type_.shape))
```
This doesn't have a type when it is constructed, but ZeroZapp can later find that node and assume it does have a type. Hence the problem.

If you're expecting types in later passes, I think the best thing is to put InferType in your callback, or between passes as you're doing here. We could think about adding that to the rewrite infrastructure, but as I've mentioned in other threads, I don't particularly want to force users to type their problems before using the pattern language in all cases.

@t-vi I'll take a closer look at your examples and see if I can figure out a way to distill it into a more refined pattern.





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/13) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by Thomas V via TVM Discuss <no...@discuss.tvm.ai>.

Oh, that is very likely the case for me here.





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/8) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by Matt Barrett via TVM Discuss <no...@discuss.tvm.ai>.

There is another way types can go awry in the dataflow matcher. When things get mutated they lose their type info until the rewrite is completed. We might want to start treating that behaviour as a bug because it's caught me out before. Maybe @mbrookhart can comment?





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/7) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by Thomas V via TVM Discuss <no...@discuss.tvm.ai>.

Thank you Matt!
Oh no. :man_facepalming:   (But `checked_type` isn't the solution, unfortunately.)


I must admit the FFI is too clever for me; without tab completion I'm lost.
I even have a 2-line patch to fix that for classes, but I don't know where to put the unit test...





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/6) to respond.


[TVM Discuss] [Questions] Same shape pattern

Posted by Matt Barrett via TVM Discuss <no...@discuss.tvm.ai>.

Have you tried using `checked_type` rather than `_checked_type_`?





---
[Visit Topic](https://discuss.tvm.ai/t/same-shape-pattern/7012/5) to respond.
