Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/02/19 00:36:51 UTC

[GitHub] ZhennanQin commented on issue #14123: Add int8 data loader

ZhennanQin commented on issue #14123: Add int8 data loader
URL: https://github.com/apache/incubator-mxnet/pull/14123#issuecomment-464933057
 
 
   @anirudh2290 Thanks for reviewing. That idea had crossed my mind, but I was worried that letting quantize_v2 accept int8 dtypes would look a bit odd and might not be accepted by the community. I have to say, that approach is better for the user experience, since a user can run a single int8 model with both the fp32 and the int8 data loader. If you think it's doable, I will refactor towards it.
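
The idea being discussed — a quantize op that passes already-int8 input through unchanged, so one quantized model serves both loaders — can be sketched roughly as below. This is a hypothetical illustration in NumPy, not MXNet's actual quantize_v2 operator; the function name, symmetric scaling scheme, and calibration parameters are assumptions for the sketch.

```python
import numpy as np

def quantize_v2_like(data, min_calib, max_calib):
    """Hypothetical pass-through quantization (NOT MXNet's real op):
    if the input is already int8, return it unchanged; otherwise
    quantize fp32 -> int8 using a symmetric calibrated range."""
    if data.dtype == np.int8:
        # int8 data loader case: the batch is consumed directly,
        # so the same int8 model works without a separate graph.
        return data
    # fp32 data loader case: scale into the signed range [-127, 127].
    scale = 127.0 / max(abs(min_calib), abs(max_calib))
    return np.clip(np.round(data * scale), -127, 127).astype(np.int8)

# Either loader feeds the same "quantized model" entry point:
fp32_batch = np.array([0.5, -1.0, 1.0], dtype=np.float32)
int8_batch = np.array([64, -127, 127], dtype=np.int8)
print(quantize_v2_like(fp32_batch, -1.0, 1.0))  # quantized to int8
print(quantize_v2_like(int8_batch, -1.0, 1.0))  # passed through as-is
```

With this dtype-dispatch behavior, the dtype check replaces the need for two separate symbol graphs, which is the user-experience benefit mentioned above.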

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services