Posted to discuss-archive@tvm.apache.org by Thomas V via TVM Discuss <no...@discuss.tvm.ai> on 2020/06/17 11:29:21 UTC
[TVM Discuss] [Questions] Relay Gradients
Hello,
I have been toying around with the gradient Relay transformation and wondered whether I am doing something wrong, because I get a rather elaborate gradient:
![linear|690x247](upload://wgrWF1xcKY67TwFabk6Rn5MAC4c.png)
gets transformed into:
![grad_linear|469x500](upload://mTnPXYdgholApZHbIaN6W1mjIcQ.png)
I must admit that this is a bit more than I had hoped for...
Now I realize that symbolic differentiation is bound to create very complex graphs, but I can't help wondering whether I did something wrong. (And there may be optimization passes that would clean up some of it.)
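As an aside, the blow-up from naive symbolic differentiation is visible even on tiny expression trees. This is a self-contained plain-Python sketch, not Relay; the expression encoding and function names are made up for illustration:

```python
# Naive symbolic differentiation over a toy expression language.
# Expressions are nested tuples: ("x",), ("const", c), ("add", a, b), ("mul", a, b).

def size(e):
    """Count the nodes in an expression tree."""
    return 1 + sum(size(a) for a in e[1:] if isinstance(a, tuple))

def diff(e):
    """d/dx, applying the sum and product rules with no simplification."""
    tag = e[0]
    if tag == "x":
        return ("const", 1.0)
    if tag == "const":
        return ("const", 0.0)
    if tag == "add":
        return ("add", diff(e[1]), diff(e[2]))
    if tag == "mul":
        # The product rule duplicates both operands, so the result grows quickly.
        return ("add", ("mul", diff(e[1]), e[2]), ("mul", e[1], diff(e[2])))
    raise ValueError(tag)

x = ("x",)
expr = ("mul", x, ("mul", x, x))   # x**3, 5 nodes
grad = diff(expr)
print(size(expr), size(grad))      # the gradient tree is already 3x larger
```

Without algebraic simplification (or common-subexpression elimination afterwards), each application of the product rule roughly doubles the subtree it touches, which matches the size explosion seen in the transformed Relay graph above.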
I am grateful for any hint you might have.
Best regards
Thomas
---
[Visit Topic](https://discuss.tvm.ai/t/relay-gradients/7002/1) to respond.
You are receiving this because you enabled mailing list mode.
To unsubscribe from these emails, [click here](https://discuss.tvm.ai/email/unsubscribe/a7619dedf4e321bfefa7a248b1def9862f9d062de4b067495a1cb120e1341760).
Posted by Thomas V via TVM Discuss <no...@discuss.tvm.ai>.
So I'm slowly wrapping my head around this.
So as not to only contribute questions all the time:
If I wanted to use the pattern language to simplify, e.g., the Let expressions, I would need to add a Let pattern, right? If that would be useful to have, I could submit a patch for it, perhaps using TuplePattern as guidance.
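The kind of Let simplification in question can be illustrated outside TVM with a toy substitution pass; the AST encoding and names below are hypothetical, just to show the rewrite a Let pattern would enable:

```python
# Toy AST: ("let", name, value, body), ("var", name), ("add", a, b), ("const", c).
# Inlining a Let means substituting its bound value into its body, which is
# roughly the rewrite a Let pattern in a pattern-matching pass would perform.

def substitute(e, name, value):
    """Replace every ("var", name) in e with value, respecting shadowing."""
    if e[0] == "var":
        return value if e[1] == name else e
    if e[0] == "let":
        _, n, v, body = e
        v = substitute(v, name, value)
        # An inner binding of the same name shadows the outer one.
        body = body if n == name else substitute(body, name, value)
        return ("let", n, v, body)
    return (e[0],) + tuple(
        substitute(a, name, value) if isinstance(a, tuple) else a for a in e[1:]
    )

def inline_lets(e):
    """Bottom-up pass that removes every Let by substitution."""
    if not isinstance(e, tuple):
        return e
    e = tuple(inline_lets(a) if isinstance(a, tuple) else a for a in e)
    if e[0] == "let":
        _, name, value, body = e
        return substitute(body, name, value)
    return e

expr = ("let", "t", ("add", ("const", 1), ("const", 2)),
        ("add", ("var", "t"), ("var", "t")))
print(inline_lets(expr))
```

Note that blind inlining duplicates the bound value wherever the variable occurs, so in practice one would only want to rewrite Lets whose binding is used once or is cheap.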
Best regards
Thomas
---
[Visit Topic](https://discuss.tvm.ai/t/relay-gradients/7002/2) to respond.