Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2021/09/22 19:57:34 UTC

[GitHub] [tvm] shashwat14 opened a new issue #9078: [Bug] Softmax in TFLite converter is channel first instead of channel last

shashwat14 opened a new issue #9078:
URL: https://github.com/apache/tvm/issues/9078


   I compiled a toy TFLite model (conv + softmax) with TVM 0.7 and compared its softmax output against the TFLite runtime.
   
   ### Expected behavior
   The uint8 output distribution should look something like the histogram below (TFLite runtime output):
   
   ![tflite_hist](https://user-images.githubusercontent.com/7594901/134404281-3752887b-da54-4d58-b461-1f867a6a3f4f.png)
   
   ### Actual behavior
   ![tvm_hist](https://user-images.githubusercontent.com/7594901/134404357-94463c5d-0059-47c4-a790-eafccbf01b41.png)
   
   After changing the axis to 3 (the last axis of this NHWC output) [here](https://github.com/apache/tvm/blob/701d2c32759c95951c4eeca229addd1036539698/python/tvm/relay/frontend/tflite.py#L684), the TVM output matches the expected TFLite histogram.
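   For a standalone illustration of the effect (my own sketch, not the frontend code), building the equivalent softmax directly in Relay with axis=1 versus axis=-1 on an NHWC tensor shows that only axis=-1 yields per-pixel probabilities over the channels:

   ```
   import numpy as np
   import tvm
   from tvm import relay, transform
   from tvm.contrib import graph_runtime as runtime

   # NHWC tensor with 14 "channels" per pixel, mimicking the conv output of the toy model.
   x_np = np.random.randn(1, 100, 160, 14).astype("float32")
   x = relay.var("x", shape=x_np.shape, dtype="float32")

   for axis in (1, -1):
       mod = tvm.IRModule.from_expr(relay.Function([x], relay.nn.softmax(x, axis=axis)))
       with transform.PassContext(opt_level=2):
           lib = relay.build(mod, "llvm")
       gmod = runtime.GraphModule(lib["default"](tvm.cpu()))
       gmod.set_input("x", tvm.nd.array(x_np))
       gmod.run()
       out = gmod.get_output(0).asnumpy()
       # Only axis=-1 produces distributions that sum to 1 over the channel axis.
       print("axis=%2d: channel sums ~ 1? %s" % (axis, np.allclose(out.sum(-1), 1.0, atol=1e-4)))
   ```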
   
   ### Environment
   OS: Ubuntu 20.04
   TVM: 0.7
   
   ### Steps to reproduce
   Dependencies: tvm 0.7, tflite, tflite_runtime, numpy, and matplotlib (to generate the plots).
   The zip below contains the TFLite model. Feel free to use a random input instead of the linked array (a stand-in is sketched just before the script).
   
   [tflite_model.tflite.zip](https://github.com/apache/tvm/files/7213238/tflite_model.tflite.zip)
   [Here](https://drive.google.com/file/d/1FJLyOVhFbLNWElnqNu9Ur-ki3WEXWDdF/view?usp=sharing) is the input numpy array.
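   If you would rather not download the array, a random uint8 input also reproduces the histogram discrepancy. A minimal stand-in (hypothetical values; only the (1, 100, 160, 256) uint8 NHWC shape matters) that can replace the `np.load` lines in the script below:

   ```
   # Hypothetical stand-in for the linked conv_softmax_input.bin; any values
   # exercise the bug, only the (1, 100, 160, 256) uint8 NHWC shape matters.
   input_data = np.random.randint(0, 256, size=(1, 100, 160, 256), dtype=np.uint8)
   ```

   The full repro script follows.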
   ```
   import tvm
   import numpy as np
   import tflite
   import tflite_runtime.interpreter as tflite_interpreter
   
   from tvm import relay, transform
   from tvm.contrib import graph_runtime as runtime
   
   import matplotlib.pyplot as plt
   
   
   def generate_hist(x, title, filename):
       plt.title(title)
       plt.xlabel('uint8 value')
       plt.xlim([-1,256])
       plt.ylabel('frequency')
       H, bins = np.histogram(x.flatten(), bins=256, range=(0,256))
       plt.bar(bins[:-1], H)
       plt.savefig(filename)
       plt.clf()
   
   # TFLite reference inference
   model_path = 'tflite_model.tflite'
   interpreter = tflite_interpreter.Interpreter(model_path=model_path)
   
   interpreter.allocate_tensors()
   
   input_details = interpreter.get_input_details()
   output_details = interpreter.get_output_details()
   
   with open('conv_softmax_input.bin', 'rb') as f:
       input_arr = np.load(f) * 255
   input_data = input_arr.astype(np.uint8)
   
   interpreter.set_tensor(input_details[0]['index'], input_data)
   interpreter.invoke()
   
   tflite_output1 = interpreter.get_tensor(output_details[0]['index'])
   
   
   # TVM Compilation
   tflite_model_buf = open(model_path, "rb").read()
   tflite_model = tflite.Model.GetRootAsModel(tflite_model_buf, 0)
   
   input_dict = {"input": (1, 100, 160, 256)}
   input_dtype = {"input": "uint8"}
   mod, params = relay.frontend.from_tflite(
       tflite_model, shape_dict=input_dict, dtype_dict=input_dtype
   )
   
   target = "llvm"
   with transform.PassContext(opt_level=2):
       lib = relay.build(mod, target, params=params)
   
   # Run the compiled module on the TVM graph runtime
   module = runtime.GraphModule(lib["default"](tvm.cpu()))
   module.set_input("input", tvm.nd.array(input_data))
   module.run()
   tvm_output1 = module.get_output(0).asnumpy()
   
   generate_hist(tflite_output1, 'tflite quant(uint8) hist', 'tflite_hist.png')
   generate_hist(tvm_output1, 'tvm quant(uint8) hist', 'tvm_hist.png')
   ```
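
   As a quick sanity check I append something like the following to the script above (assuming the interpreter reports the output quantization as a `(scale, zero_point)` pair for this uint8 model). After dequantizing, the TFLite output sums to roughly 1 over the last axis at every pixel; with the hard-coded axis=1, the TVM output is not expected to pass the same check.

   ```
   # Dequantize both outputs with the TFLite output scale/zero-point and check
   # that the per-pixel softmax sums to ~1 over the last (channel) axis.
   scale, zero_point = output_details[0]['quantization']
   tflite_probs = (tflite_output1.astype(np.float32) - zero_point) * scale
   tvm_probs = (tvm_output1.astype(np.float32) - zero_point) * scale
   print("tflite sums ~ 1:", np.allclose(tflite_probs.sum(axis=-1), 1.0, atol=0.05))
   print("tvm    sums ~ 1:", np.allclose(tvm_probs.sum(axis=-1), 1.0, atol=0.05))
   ```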



[GitHub] [tvm] shashwat14 removed a comment on issue #9078: [Bug] Softmax in TFLite converter is channel first instead of channel last

Posted by GitBox <gi...@apache.org>.
shashwat14 removed a comment on issue #9078:
URL: https://github.com/apache/tvm/issues/9078#issuecomment-926806805


   I think the axis is hard-coded to 1 on the assumption that the model is a simple classification network. However, the softmax can occur over any axis: in the example above it is applied per "pixel", where each pixel has 14 channels. The axis parameter should always represent the channel index (unless overridden by the user), which in numpy notation is -1 for most cases.



[GitHub] [tvm] shashwat14 commented on issue #9078: [Bug] Softmax in TFLite converter is channel first instead of channel last

Posted by GitBox <gi...@apache.org>.
shashwat14 commented on issue #9078:
URL: https://github.com/apache/tvm/issues/9078#issuecomment-926806805


   I think the axis is hard-coded to 1 on the assumption that the model is a simple classification network. However, the softmax can occur over any axis: in the example above it is applied per "pixel", where each pixel has 14 channels. The axis parameter should always represent the channel index (unless overridden by the user), which in numpy notation is -1 for most cases.
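
   For illustration, the change I have in mind would look roughly like the sketch below. It is abridged: the helper names follow my reading of the 0.7 `tflite.py`, and the real quantized path additionally wraps the softmax in dequantize/quantize, so treat this as a sketch rather than a patch.

   ```
   # python/tvm/relay/frontend/tflite.py -- abridged sketch of convert_softmax
   def convert_softmax(self, op):
       """Convert TFLite SOFTMAX (sketch; quantization handling omitted)."""
       input_tensors = self.get_input_tensors(op)
       in_expr = self.get_expr(input_tensors[0].tensor_idx)

       # Was: params = {"axis": 1}  (channel-first assumption)
       # TFLite's SOFTMAX always reduces over the last axis, so default to -1.
       params = {"axis": -1}
       return _op.nn.softmax(in_expr, **params)
   ```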



[GitHub] [tvm] masahi closed issue #9078: [Bug] Softmax in TFLite converter is channel first instead of channel last

Posted by GitBox <gi...@apache.org>.
masahi closed issue #9078:
URL: https://github.com/apache/tvm/issues/9078


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@tvm.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org