Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2020/06/15 01:00:29 UTC

[GitHub] [incubator-tvm] hypercubestart opened a new pull request #5808: [Relay] Support for ADT in Lazy Gradient Init Pass

hypercubestart opened a new pull request #5808:
URL: https://github.com/apache/incubator-tvm/pull/5808


   Lazy Gradient Initialization currently works only on tensor types. This adds support for potentially recursive ADTs, functions, and tuple types.
   
   Also:
   - added a `Check` function to IRModule to make it easy to type-infer potentially mutually recursive functions
   - exposed the `map_free_vars` argument in StructuralEqual
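   As a hedged sketch of the second bullet (this assumes a TVM build where `tvm.ir.structural_equal` accepts the `map_free_vars` keyword, as in current mainline TVM), the flag controls whether distinct free variables may be mapped onto each other during the equality check:
   
   ```python
   import tvm
   from tvm import relay
   
   # Two free variables with the same type annotation but different names.
   x = relay.var("x", shape=(2, 2))
   y = relay.var("y", shape=(2, 2))
   
   # With map_free_vars=True, the free vars x and y are allowed to correspond,
   # so the two expressions are considered structurally equal.
   assert tvm.ir.structural_equal(x, y, map_free_vars=True)
   
   # With the default map_free_vars=False, distinct free vars do not match.
   assert not tvm.ir.structural_equal(x, y)
   ```
   
   This matters for the pass because mutually recursive functions reference each other as free variables, so comparing their bodies requires the relaxed mode.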
   
   I would really appreciate a review @MarisaKirisame @altanh @joshpoll @slyubomirsky 
   
   Thanks!


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] hypercubestart closed pull request #5808: [Relay] Support for ADT in Lazy Gradient Init Pass

Posted by GitBox <gi...@apache.org>.
hypercubestart closed pull request #5808:
URL: https://github.com/apache/incubator-tvm/pull/5808


   





[GitHub] [incubator-tvm] MarisaKirisame commented on pull request #5808: [Relay] Support for ADT in Lazy Gradient Init Pass

Posted by GitBox <gi...@apache.org>.
MarisaKirisame commented on pull request #5808:
URL: https://github.com/apache/incubator-tvm/pull/5808#issuecomment-644667025


   @hypercubestart there are two things here: one is extending Relay to work on mutually recursive functions, and the other is the custom AD pass. Can you separate the two and make 2 PRs (probably the mutual-recursion one now, and the other only once it is merged)? Doing this reduces the reviewer expertise required (some people know how to do mutual recursion but don't know AD, for example) and leaves better git commit messages. Generally speaking, we should break each PR down to do one thing.

