Posted to dev@tvm.apache.org by Shawhey via TVM Discuss <no...@discuss.tvm.ai> on 2019/07/21 14:14:15 UTC
[TVM Discuss] [Development] Want help adding Batch_Matmul op for TensorFlow frontend
https://github.com/dmlc/tvm/issues/3414
I am trying to add a batch_matmul op to the TensorFlow frontend.
I edited the files below:
python/tvm/relay/frontend/tensorflow.py,
include/tvm/relay/attrs/nn.h,
python/tvm/relay/op/nn/_nn.py,
python/tvm/relay/op/nn/nn.py,
python/tvm/relay/op/op_attrs.py,
src/relay/op/nn/nn.cc,
topi/include/topi/nn/batch_matmul.h,
topi/python/topi/generic/nn.py,
topi/python/topi/nn/batch_matmul.py,
topi/src/topi.cc
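For reference, here is what the new op should compute. This is a minimal NumPy sketch (not TVM code), assuming TOPI's convention that `batch_matmul(A, B)` multiplies each batch slice of `A` (shape `(batch, M, K)`) by the transpose of the corresponding slice of `B` (shape `(batch, N, K)`):

```python
import numpy as np

def batch_matmul_ref(a, b):
    """NumPy reference for TOPI-style batch_matmul.

    a: (batch, M, K), b: (batch, N, K) -- the second operand is
    treated as transposed, matching topi.nn.batch_matmul's convention.
    Returns: (batch, M, N)
    """
    return np.einsum("bmk,bnk->bmn", a, b)

a = np.random.rand(4, 3, 5)
b = np.random.rand(4, 2, 5)
out = batch_matmul_ref(a, b)
assert out.shape == (4, 3, 2)
# Each batch slice equals an ordinary matmul against b[i].T
assert np.allclose(out[0], a[0] @ b[0].T)
```

A reference like this is handy for verifying the compiled Relay module's output once the op compiles.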
Then I built the project and it compiled without error.
When I then called
`sym, params = relay.frontend.from_tensorflow(my_graph_def, shape=my_shape)`
it worked fine.
But the next lines are:
```
with relay.build_config(opt_level=3):
    graph, lib, params = relay.build(sym, target='llvm', target_host='llvm', params=params)
```
Here I hit an error:
tvm/python/tvm/_ffi/_ctypes/function.py raises get_last_ffi_error, which wraps `TypeError: batch_matmul() argument after ** must be a mapping, not NoneType`, pointing at
> tvm/python/tvm/relay/op/nn/_nn.py", line 75, in compute_batch_matmul
> return [topi.nn.batch_matmul(inputs[0],inputs[1],**attrs)]
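The TypeError suggests that the `attrs` reaching `compute_batch_matmul` is `None` rather than a mapping: Python raises exactly this message whenever `None` is unpacked with `**`. A minimal reproduction in plain Python (no TVM needed; `batch_matmul` here is just a stand-in for `topi.nn.batch_matmul`):

```python
def batch_matmul(x, y, **kwargs):
    # Stand-in for topi.nn.batch_matmul; only the call convention matters.
    return (x, y, kwargs)

# Unpacking a dict (even an empty one) works:
assert batch_matmul(1, 2, **{}) == (1, 2, {})

# Unpacking None reproduces the reported error:
try:
    batch_matmul(1, 2, **None)
    msg = ""
except TypeError as e:
    msg = str(e)
assert "must be a mapping" in msg
```

So the fix is probably not in TOPI itself but in how the attrs for the new op are registered or passed through: either make sure the op's attrs are actually populated, or (since `batch_matmul` takes no extra attributes) call `topi.nn.batch_matmul(inputs[0], inputs[1])` without unpacking `attrs`. This is a guess from the error message, not a confirmed diagnosis.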
I need some help ...
---
[Visit Topic](https://discuss.tvm.ai/t/want-help-adding-batch-matmul-op-for-tenforflow-frontend/3418/1) to respond.