Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2020/06/22 20:11:38 UTC

[GitHub] [incubator-tvm] t-vi opened a new pull request #5887: keep parameter names from PyTorch

t-vi opened a new pull request #5887:
URL: https://github.com/apache/incubator-tvm/pull/5887


   This patch uses the PyTorch parameter names as the name hints for the corresponding variables.
   This means that one can load a stored PyTorch state dict and map the required inputs from it even after conversion.
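   
   For illustration, a minimal sketch of how the matching names can be used after conversion (the model, the input name "input", and the reload loop below are made-up examples, not part of the PR):
   
   ```python
   import torch
   import torchvision
   import tvm
   from tvm import relay
   
   model = torchvision.models.resnet18().eval()
   inp = torch.randn(1, 3, 224, 224)
   scripted = torch.jit.trace(model, inp)
   
   # With this patch the converted params are keyed by the PyTorch parameter
   # names, so the keys line up with the keys of model.state_dict().
   mod, params = relay.frontend.from_pytorch(scripted, [("input", (1, 3, 224, 224))])
   
   state_dict = model.state_dict()
   for name in params:
       if name in state_dict:
           # e.g. re-load a weight from a stored state dict after conversion
           params[name] = tvm.nd.array(state_dict[name].cpu().numpy())
   ```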


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] siju-samuel commented on pull request #5887: keep parameter names from PyTorch

Posted by GitBox <gi...@apache.org>.
siju-samuel commented on pull request #5887:
URL: https://github.com/apache/incubator-tvm/pull/5887#issuecomment-647953471


   @t-vi Thanks for the PR. 
   I think this PR has a slight impact in some scenarios.
   You can confirm by running the script below before and after your change.
   
   [pytorch_pretrained_bert_uncased.py](https://gist.github.com/siju-samuel/f67aa3e1d0568c7b53876c4fb4e7a56c)
   
   I think after this PR, `param_tensor` and `param` are not getting assigned properly in some cases.
   Please look into this. 
   Thanks in advance.





[GitHub] [incubator-tvm] t-vi edited a comment on pull request #5887: keep parameter names from PyTorch

Posted by GitBox <gi...@apache.org>.
t-vi edited a comment on pull request #5887:
URL: https://github.com/apache/incubator-tvm/pull/5887#issuecomment-648010250


   I found the problem and will send a fix.
   The old code included the embedding weight twice (arguably a bug in itself that needs fixing), and the new code deduplicated the param but not the var (which is even worse).
   We underappreciate the structure of the PyTorch input...
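   
   For illustration, a hypothetical sketch of the deduplication idea (the helper below is made up and is not the code from the fix): a shared tensor such as a tied embedding weight should map to a single Relay var and a single param entry, instead of duplicating either.
   
   ```python
   from tvm import relay
   
   def convert_params(named_params, dtype="float32"):
       # Hypothetical illustration: key the dedup on the underlying tensor so a
       # weight that appears under two names yields only one var and one param.
       vars_by_tensor = {}   # id(tensor) -> relay.Var
       out_vars = {}         # name -> relay.Var
       params = {}           # var name hint -> tensor
       for name, tensor in named_params:
           key = id(tensor)
           if key not in vars_by_tensor:
               vars_by_tensor[key] = relay.var(name, shape=tuple(tensor.shape), dtype=dtype)
               params[name] = tensor
           # duplicates reuse the existing var instead of creating a second one
           out_vars[name] = vars_by_tensor[key]
       return out_vars, params
   ```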





[GitHub] [incubator-tvm] siju-samuel commented on pull request #5887: keep parameter names from PyTorch

Posted by GitBox <gi...@apache.org>.
siju-samuel commented on pull request #5887:
URL: https://github.com/apache/incubator-tvm/pull/5887#issuecomment-647993442


   Output mismatch between PyTorch and TVM.
   
   After this PR:
   ```
     File "pytorch_pretrained_bert_uncased.py", line 112, in <module>
       tvm.testing.assert_allclose(torch_preds, compiled_output, rtol=1e-3, atol=1e-3)
     File "/home/siju/workspace/tvm/python/tvm/testing.py", line 36, in assert_allclose
       np.testing.assert_allclose(actual, desired, rtol=rtol, atol=atol, verbose=True)
     File "/home/siju/.local/lib/python3.8/site-packages/numpy/testing/_private/utils.py", line 1532, in assert_allclose
       assert_array_compare(compare, actual, desired, err_msg=str(err_msg),
     File "/home/siju/.local/lib/python3.8/site-packages/numpy/testing/_private/utils.py", line 846, in assert_array_compare
       raise AssertionError(msg)
   AssertionError: 
   Not equal to tolerance rtol=0.001, atol=0.001
   
   Mismatched elements: 427297 / 427308 (100%)
   Max absolute difference: 24.692068
   Max relative difference: 196537.89
    x: array([[[ -7.879808,  -7.787371,  -7.786093, ...,  -7.043789,
             -6.745376,  -4.60134 ],
           [-13.363304, -13.769426, -13.781861, ..., -11.81282 ,...
    y: array([[[-0.419126, -0.420205, -0.41907 , ..., -0.789973, -0.782199,
            -0.496477],
           [-0.419126, -0.420205, -0.41907 , ..., -0.789973, -0.782199,...
   
   ```
   





[GitHub] [incubator-tvm] t-vi commented on pull request #5887: keep parameter names from PyTorch

Posted by GitBox <gi...@apache.org>.
t-vi commented on pull request #5887:
URL: https://github.com/apache/incubator-tvm/pull/5887#issuecomment-648010250


   I found the problem and will send a fix.
   The old code included the embedding weight twice (arguably a bug in itself that needs fixing), and the new code deduplicated the param but not the var (which is even worse).





[GitHub] [incubator-tvm] t-vi commented on pull request #5887: keep parameter names from PyTorch

Posted by GitBox <gi...@apache.org>.
t-vi commented on pull request #5887:
URL: https://github.com/apache/incubator-tvm/pull/5887#issuecomment-647989008


   What is the error you are seeing?
   





[GitHub] [incubator-tvm] t-vi commented on pull request #5887: keep parameter names from PyTorch

Posted by GitBox <gi...@apache.org>.
t-vi commented on pull request #5887:
URL: https://github.com/apache/incubator-tvm/pull/5887#issuecomment-648023393


   #5897 has the fix.





[GitHub] [incubator-tvm] masahi merged pull request #5887: keep parameter names from PyTorch

Posted by GitBox <gi...@apache.org>.
masahi merged pull request #5887:
URL: https://github.com/apache/incubator-tvm/pull/5887


   





[GitHub] [incubator-tvm] masahi commented on pull request #5887: keep parameter names from PyTorch

Posted by GitBox <gi...@apache.org>.
masahi commented on pull request #5887:
URL: https://github.com/apache/incubator-tvm/pull/5887#issuecomment-647822836


   Thanks @t-vi 

