Posted to discuss-archive@tvm.apache.org by Max Sponner via TVM Discuss <no...@discuss.tvm.ai> on 2020/08/19 13:04:23 UTC

[TVM Discuss] [Questions] Codegeneration for own DLA Instruction Set


I have another problem:

The graph annotation has been defined (by adding my own version at python/tvm/relay/op/contrib/test_dla.py)

But it seems that this is not enough to get the annotation going, as

`mod_t = transform.AnnotateTarget("test_dla")(mod)`

followed by 

`mod_t = transform.PartitionGraph()(mod_t)`

results in no change to the representation.

What did I miss to enable my specific annotation?
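For orientation, `AnnotateTarget` essentially consults a per-operator attribute named after the target, so an operator is only annotated when a checker was registered under that operator's exact name. A toy model of that lookup (all names here are illustrative stand-ins, not real TVM internals):

```python
# Toy model (illustrative only): a registry maps (op name, target) to a
# checker function; ops absent from the registry are never annotated.
_registry = {}

def register_op_attr(op_name, target):
    """Hypothetical stand-in for tvm.ir.register_op_attr."""
    def wrap(fn):
        _registry[(op_name, target)] = fn
        return fn
    return wrap

@register_op_attr("qnn.add", "test_dla")
def _add_checker(attrs, args):
    return True

def would_annotate(op_name, target):
    """An op is annotated only if a checker is registered for this exact
    op name and that checker returns True."""
    fn = _registry.get((op_name, target))
    return bool(fn and fn(None, None))
```

Under this model, registering checkers for `qnn.add`/`qnn.subtract` while the graph contains plain `add`/`subtract` would leave every node unannotated, which would match the symptom described above.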


The graph in question looks like this:

    def @main(%x: Tensor[(10, 10), int8], %y: Tensor[(10, 10), int8]) -> Tensor[(10, 10), int8] {
      %0 = multiply(%y, %y) /* ty=Tensor[(10, 10), int8] */;
      %1 = add(%x, %x) /* ty=Tensor[(10, 10), int8] */;
      subtract(%0, %1) /* ty=Tensor[(10, 10), int8] */
    }

And my annotations: (I tried replacing qnn.add with add and the same for subtract, but maybe I have forgotten to register my annotation somewhere?)

    @tvm.ir.register_op_attr("qnn.add", target_name)
    def add(attr, args):
        ''' check if tensor addition is supported by DLA'''
        typ = args[0].checked_type

        if typ.dtype != "int8":
            return False

        #TODO: how to test for equal shapes?
        return True

    @tvm.ir.register_op_attr("qnn.subtract", target_name)
    def sub(attr, args):
        ''' check if tensor subtraction is supported by DLA'''
        typ = args[0].checked_type

        if typ.dtype != "int8":
            return False

        #TODO: how to test for equal shapes?
        return True
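For the shape TODO, one possible approach is to compare the operands' static shapes directly. Sketched here with a stand-in type object for illustration; in the real checker the types would come from `args[i].checked_type`:

```python
from collections import namedtuple

# Stand-in for a Relay tensor type, purely for illustration.
TensorType = namedtuple("TensorType", ["dtype", "shape"])

def dla_supports_elementwise(arg_types):
    """Sketch: accept only int8 operands whose static shapes all match."""
    if any(t.dtype != "int8" for t in arg_types):
        return False
    # Equal-shape test: compare every operand's shape to the first one.
    first = list(arg_types[0].shape)
    return all(list(t.shape) == first for t in arg_types)
```

The same helper could then back both the `qnn.add` and `qnn.subtract` checkers instead of duplicating the dtype test.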





---
[Visit Topic](https://discuss.tvm.ai/t/codegeneration-for-own-dla-instruction-set/7538/15) to respond.
