Posted to discuss-archive@tvm.apache.org by Ryan Chang via TVM Discuss <no...@discuss.tvm.ai> on 2020/08/25 09:20:42 UTC

[TVM Discuss] [Questions] Why can't I save the file


TVM version:

    commit f6e00a0d8b384e8cd81b90668c06f68cd52161b9
    Author: Iswariya Manivannan <36...@users.noreply.github.com>
    Date:   Mon Aug 17 05:53:38 2020 +0200

This is my code:

    ...
    mod, params = relay.frontend.from_tflite(
        tflite_model,
        shape_dict={input_tensor: input_shape},
        dtype_dict={input_tensor: input_dtype})

    print(mod)

    with relay.quantize.qconfig(
            nbit_activation=32,
            dtype_input="int8",
            dtype_weight="int8",
            dtype_activation="int32",
            skip_conv_layers=[0]):
        print("   ", relay.quantize.current_qconfig())
        mod = relay.quantize.quantize(mod, params)
    ...

    # compile the model on Relay
    with relay.build_config(opt_level=3):
        graph, lib, params = relay.build_module.build(mod, target, params=params)

    # model save
    lib.save("test.ll", fmt="ll")
    with open('test.json', 'w') as _f:
        _f.write(graph)
    with open('test.params', 'wb') as _f:
        _f.write(relay.save_param_dict(params))

This code saves test.ll, test.json, and test.params correctly, but the following version fails.

    ...
    mod, params = relay.frontend.from_tflite(
        tflite_model,
        shape_dict={input_tensor: input_shape},
        dtype_dict={input_tensor: input_dtype})

    print(mod)

    with relay.quantize.qconfig(
            nbit_activation=32,
            dtype_input="int8",
            dtype_weight="int8",
            dtype_activation="int32",
            skip_conv_layers=[0]):
        print("   ", relay.quantize.current_qconfig())
        mod = relay.quantize.quantize(mod, params)

    pattern_table = [
        ("dnnl.conv2d_add_right_shift_clip_cast", make_pattern_())
    ]

    mod["main"] = run_opt_pass(mod["main"], relay.transform.MergeComposite(pattern_table))
    mod = composite_partition(mod)
    mod = transform.AnnotateTarget(["dnnl"])(mod)
    mod = transform.MergeCompilerRegions()(mod)
    ...

    # compile the model on Relay
    with relay.build_config(opt_level=3):
        graph, lib, params = relay.build_module.build(mod, target, params=params)

    # model save
    lib.save("test.ll", fmt="ll")
    with open('test.json', 'w') as _f:
        _f.write(graph)
    with open('test.params', 'wb') as _f:
        _f.write(relay.save_param_dict(params))

Error message:

    test_pass_partition_graph.py:1469: DeprecationWarning: legacy graph runtime behaviour of producing json / lib / params will be removed in the next release
      graph, lib, params = relay.build_module.build(mod, target, params=params)
    Traceback (most recent call last):
      File "test_pass_partition_graph.py", line 1630, in <module>
        test_extern_dnnl_LeNet()
      File "test_pass_partition_graph.py", line 1475, in test_extern_dnnl_LeNet
        lib.save("test.ll",fmt="ll")
      File "/home/ymchang/tvm-master/python/tvm/runtime/module.py", line 164, in save
        _ffi_api.ModuleSaveToFile(self, file_name, fmt)
      File "/home/ymchang/tvm-master/python/tvm/_ffi/_ctypes/packed_func.py", line 225, in __call__
        raise get_last_ffi_error()
    tvm._ffi.base.TVMError: Traceback (most recent call last):
      [bt] (3) /home/ymchang/tvm-master/build/libtvm.so(TVMFuncCall+0x61) [0x7f7c64852631]
      [bt] (2) /home/ymchang/tvm-master/build/libtvm.so(+0x14ed260) [0x7f7c64871260]
      [bt] (1) /home/ymchang/tvm-master/build/libtvm.so(tvm::runtime::ModuleNode::SaveToFile(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)+0x8d) [0x7f7c6486d31d]
      [bt] (0) /home/ymchang/tvm-master/build/libtvm.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x61) [0x7f7c63c6d7c1]
      File "/home/ymchang/tvm-master/src/runtime/module.cc", line 90
    TVMError: Module[metadata] does not support SaveToFile
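
For reference, one way to see where the extra module comes from might be to inspect the module kind before saving. This is only a diagnostic sketch against the lib returned by relay.build_module.build above; nothing else is assumed:

    # Diagnostic sketch (assumption: lib is the module from the build above).
    # After the dnnl partitioning passes, lib is a metadata module that wraps
    # several submodules, and Module.save() only handles a single LLVM/source
    # module.
    print(lib.type_key)                  # e.g. "metadata" in the failing case
    for i, m in enumerate(lib.imported_modules):
        print(i, m.type_key)             # the wrapped submodules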





---

[TVM Discuss] [Questions] Why can't I save the file

Posted by Zhi via TVM Discuss <no...@discuss.tvm.ai>.

Yeah, this is due to the missing SaveToFile in MetadataModule. It is inconvenient to save them to a single file, though, because there are multiple modules (i.e. an LLVM module and some subgraph modules).
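
If the goal is just to get the compiled artifacts onto disk, a common alternative is to export the whole module as a shared library instead of calling lib.save, since export_library packages the imported submodules together. A minimal sketch using the graph/lib/params variables from the question; the file names are only examples:

    # Sketch: export the whole compiled module as one shared library instead of
    # lib.save(); export_library also takes care of the imported submodules.
    lib.export_library("test.so")
    with open("test.json", "w") as f:
        f.write(graph)
    with open("test.params", "wb") as f:
        f.write(relay.save_param_dict(params))

    # It can be loaded back later with:
    # loaded = tvm.runtime.load_module("test.so")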





---

[TVM Discuss] [Questions] Why can't I save the file

Posted by "Cody H. Yu via TVM Discuss" <no...@discuss.tvm.ai>.

@zhiics it looks like a problem of a missing SaveToFile implementation.




