Posted to commits@tvm.apache.org by tq...@apache.org on 2024/02/16 23:36:58 UTC

(tvm-site) branch asf-site updated: deploying docs (apache/tvm@5645c52c6d3105fb6c58cb7e1d983eff6ff26c19)

This is an automated email from the ASF dual-hosted git repository.

tqchen pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/tvm-site.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new 1b19a574e1 deploying docs (apache/tvm@5645c52c6d3105fb6c58cb7e1d983eff6ff26c19)
1b19a574e1 is described below

commit 1b19a574e14cea6294e78d34703f758590178a90
Author: tvm-bot <95...@users.noreply.github.com>
AuthorDate: Fri Feb 16 23:36:52 2024 +0000

    deploying docs (apache/tvm@5645c52c6d3105fb6c58cb7e1d983eff6ff26c19)
---
 .../how_to/compile_models/from_darknet.rst.txt     |   2 +-
 .../how_to/compile_models/from_oneflow.rst.txt     |   2 +-
 .../how_to/compile_models/from_paddle.rst.txt      |   5 -
 .../how_to/compile_models/from_pytorch.rst.txt     |   2 +-
 .../how_to/compile_models/from_tensorflow.rst.txt  |   2 +-
 .../compile_models/sg_execution_times.rst.txt      |  20 +-
 docs/_sources/how_to/deploy/index.rst.txt          |   1 +
 docs/_sources/how_to/deploy/mrvl.rst.txt           | 235 ++++++++++++++++
 .../deploy_models/deploy_model_on_adreno.rst.txt   |   4 +-
 .../deploy_model_on_adreno_tvmc.rst.txt            |   2 +-
 .../deploy_models/deploy_model_on_android.rst.txt  |   2 +-
 .../deploy_object_detection_pytorch.rst.txt        |   4 +-
 .../deploy_models/deploy_prequantized.rst.txt      |   6 +-
 .../deploy_prequantized_tflite.rst.txt             |   2 +-
 .../deploy_models/sg_execution_times.rst.txt       |  20 +-
 .../how_to/extend_tvm/sg_execution_times.rst.txt   |   6 +-
 .../how_to/extend_tvm/use_pass_instrument.rst.txt  |  16 +-
 .../optimize_operators/opt_conv_cuda.rst.txt       |   2 +-
 .../optimize_operators/opt_conv_tensorcore.rst.txt |   2 +-
 .../how_to/optimize_operators/opt_gemm.rst.txt     |  16 +-
 .../optimize_operators/sg_execution_times.rst.txt  |   8 +-
 .../sg_execution_times.rst.txt                     |  12 +-
 .../tune_conv2d_layer_cuda.rst.txt                 |   2 +-
 .../tune_network_cuda.rst.txt                      |   4 +-
 .../tune_network_x86.rst.txt                       |   4 +-
 .../tune_with_autotvm/sg_execution_times.rst.txt   |   4 +-
 .../tune_with_autotvm/tune_conv2d_cuda.rst.txt     |   2 +-
 .../work_with_microtvm/micro_autotune.rst.txt      |  18 +-
 .../work_with_microtvm/micro_pytorch.rst.txt       |   4 +-
 .../how_to/work_with_microtvm/micro_train.rst.txt  |  16 +-
 .../work_with_microtvm/sg_execution_times.rst.txt  |  18 +-
 .../work_with_relay/sg_execution_times.rst.txt     |   8 +-
 .../how_to/work_with_schedules/intrin_math.rst.txt |   2 +-
 .../work_with_schedules/sg_execution_times.rst.txt |  18 +-
 .../tutorials/frontend/deploy_detection.rst.txt    |   2 +-
 .../tutorials/frontend/sg_execution_times.rst.txt  |   4 +-
 .../tutorials/optimize/sg_execution_times.rst.txt  |   6 +-
 .../topic/vta/tutorials/sg_execution_times.rst.txt |   6 +-
 .../tutorial/auto_scheduler_matmul_x86.rst.txt     |  11 +-
 docs/_sources/tutorial/autotvm_matmul_x86.rst.txt  |  20 +-
 docs/_sources/tutorial/autotvm_relay_x86.rst.txt   |  58 ++--
 .../tutorial/cross_compilation_and_rpc.rst.txt     |   2 +-
 docs/_sources/tutorial/intro_topi.rst.txt          |   2 +-
 docs/_sources/tutorial/sg_execution_times.rst.txt  |  18 +-
 .../tutorial/tensor_expr_get_started.rst.txt       |  49 ++--
 docs/commit_hash                                   |   2 +-
 docs/how_to/compile_models/from_darknet.html       |   2 +-
 docs/how_to/compile_models/from_oneflow.html       |  15 +-
 docs/how_to/compile_models/from_paddle.html        |   1 -
 docs/how_to/compile_models/from_pytorch.html       |  19 +-
 docs/how_to/compile_models/from_tensorflow.html    |   2 +-
 docs/how_to/compile_models/sg_execution_times.html |  20 +-
 docs/how_to/deploy/adreno.html                     |   1 +
 docs/how_to/deploy/android.html                    |   1 +
 docs/how_to/deploy/arm_compute_lib.html            |   1 +
 docs/how_to/deploy/bnns.html                       |   5 +-
 docs/how_to/deploy/cpp_deploy.html                 |   1 +
 docs/how_to/deploy/hls.html                        |   1 +
 docs/how_to/deploy/index.html                      |  15 ++
 docs/how_to/deploy/integrate.html                  |   1 +
 docs/how_to/deploy/{bnns.html => mrvl.html}        | 299 +++++++++++----------
 docs/how_to/deploy/tensorrt.html                   |   1 +
 docs/how_to/deploy/vitis_ai.html                   |   1 +
 .../deploy_models/deploy_model_on_adreno.html      |   4 +-
 .../deploy_models/deploy_model_on_adreno_tvmc.html |  25 +-
 .../deploy_models/deploy_model_on_android.html     |   2 +-
 .../deploy_object_detection_pytorch.html           |  62 +++--
 docs/how_to/deploy_models/deploy_prequantized.html |  10 +-
 .../deploy_models/deploy_prequantized_tflite.html  |   2 +-
 docs/how_to/deploy_models/index.html               |   4 +-
 docs/how_to/deploy_models/sg_execution_times.html  |  24 +-
 docs/how_to/extend_tvm/sg_execution_times.html     |   6 +-
 docs/how_to/extend_tvm/use_pass_instrument.html    |  16 +-
 docs/how_to/optimize_operators/opt_conv_cuda.html  |   2 +-
 .../optimize_operators/opt_conv_tensorcore.html    |   2 +-
 docs/how_to/optimize_operators/opt_gemm.html       |  16 +-
 .../optimize_operators/sg_execution_times.html     |   8 +-
 .../sg_execution_times.html                        |  12 +-
 .../tune_conv2d_layer_cuda.html                    |   2 +-
 .../tune_with_autoscheduler/tune_network_cuda.html |   4 +-
 .../tune_with_autoscheduler/tune_network_x86.html  |   4 +-
 .../tune_with_autotvm/sg_execution_times.html      |   4 +-
 .../how_to/tune_with_autotvm/tune_conv2d_cuda.html |   2 +-
 docs/how_to/work_with_microtvm/micro_autotune.html |  18 +-
 docs/how_to/work_with_microtvm/micro_pytorch.html  |   6 +-
 docs/how_to/work_with_microtvm/micro_train.html    |  16 +-
 .../work_with_microtvm/sg_execution_times.html     |  18 +-
 .../how_to/work_with_relay/sg_execution_times.html |   8 +-
 docs/how_to/work_with_schedules/intrin_math.html   |   2 +-
 .../work_with_schedules/sg_execution_times.html    |  18 +-
 docs/objects.inv                                   | Bin 25652 -> 25672 bytes
 docs/reference/api/python/auto_scheduler.html      |   4 +-
 .../api/typedoc/classes/ArtifactCache.html         |   8 +-
 docs/reference/api/typedoc/classes/DLDataType.html |  14 +-
 docs/reference/api/typedoc/classes/DLDevice.html   |  12 +-
 docs/reference/api/typedoc/classes/Instance.html   | 104 +++----
 docs/reference/api/typedoc/classes/Module.html     |  12 +-
 docs/reference/api/typedoc/classes/NDArray.html    |  30 +--
 docs/reference/api/typedoc/classes/RPCServer.html  |  16 +-
 docs/reference/api/typedoc/classes/Scalar.html     |   8 +-
 docs/reference/api/typedoc/classes/TVMArray.html   |  16 +-
 docs/reference/api/typedoc/classes/TVMObject.html  |  12 +-
 .../api/typedoc/classes/VirtualMachine.html        |  10 +-
 .../classes/_internal_.CachedCallStack.html        |  36 +--
 .../classes/_internal_.CanvasRenderManager.html    |  10 +-
 .../typedoc/classes/_internal_.Environment.html    |  14 +-
 .../api/typedoc/classes/_internal_.FFILibrary.html |  22 +-
 .../api/typedoc/classes/_internal_.Memory.html     |  36 +--
 .../typedoc/classes/_internal_.PackedFuncCell.html |   8 +-
 .../typedoc/classes/_internal_.RuntimeContext.html |  54 ++--
 .../api/typedoc/classes/_internal_.TVMString.html  |  14 +-
 .../typedoc/classes/_internal_.WebGPUContext.html  |  28 +-
 .../typedoc/enums/_internal_.RPCServerState.html   |  14 +-
 docs/reference/api/typedoc/functions/assert.html   |   2 +-
 .../api/typedoc/functions/createPolyfillWASI.html  |   2 +-
 .../api/typedoc/functions/detectGPUDevice.html     |   2 +-
 .../api/typedoc/functions/hasNDArrayInCache.html   |   2 +-
 .../api/typedoc/functions/instantiate.html         |   2 +-
 docs/reference/api/typedoc/functions/wasmPath.html |   2 +-
 .../api/typedoc/interfaces/Disposable.html         |   4 +-
 .../typedoc/interfaces/GPUDeviceDetectOutput.html  |   8 +-
 .../api/typedoc/interfaces/InitProgressReport.html |  10 +-
 .../api/typedoc/interfaces/LibraryProvider.html    |   6 +-
 .../interfaces/_internal_.FunctionInfo.html        |   8 +-
 .../interfaces/_internal_.NDArrayCacheEntry.html   |  14 +-
 .../interfaces/_internal_.NDArrayShardEntry.html   |  10 +-
 .../api/typedoc/types/InitProgressCallback.html    |   2 +-
 docs/reference/api/typedoc/types/PackedFunc.html   |   2 +-
 .../types/_internal_.FObjectConstructor.html       |   2 +-
 .../types/_internal_.FTVMWasmAllocSpace.html       |   2 +-
 .../types/_internal_.FTVMWasmFreeSpace.html        |   2 +-
 .../types/_internal_.FTVMWasmPackedCFunc.html      |   2 +-
 .../api/typedoc/types/_internal_.Pointer.html      |   2 +-
 .../typedoc/types/_internal_.TVMObjectBase.html    |   2 +-
 docs/searchindex.js                                |   2 +-
 .../vta/tutorials/frontend/deploy_detection.html   |   2 +-
 .../vta/tutorials/frontend/sg_execution_times.html |   4 +-
 .../vta/tutorials/optimize/sg_execution_times.html |   6 +-
 docs/topic/vta/tutorials/sg_execution_times.html   |   6 +-
 docs/tutorial/auto_scheduler_matmul_x86.html       |   7 +-
 docs/tutorial/autotvm_matmul_x86.html              |  20 +-
 docs/tutorial/autotvm_relay_x86.html               | 268 +++++++++---------
 docs/tutorial/cross_compilation_and_rpc.html       |   2 +-
 docs/tutorial/intro_topi.html                      |   2 +-
 docs/tutorial/sg_execution_times.html              |  22 +-
 docs/tutorial/tensor_expr_get_started.html         |  45 ++--
 146 files changed, 1283 insertions(+), 1007 deletions(-)

diff --git a/docs/_sources/how_to/compile_models/from_darknet.rst.txt b/docs/_sources/how_to/compile_models/from_darknet.rst.txt
index a4e46cca8e..ca0fb4b67b 100644
--- a/docs/_sources/how_to/compile_models/from_darknet.rst.txt
+++ b/docs/_sources/how_to/compile_models/from_darknet.rst.txt
@@ -318,7 +318,7 @@ The process is no different from other examples.
 
 .. rst-class:: sphx-glr-timing
 
-   **Total running time of the script:** ( 1 minutes  34.628 seconds)
+   **Total running time of the script:** ( 1 minutes  33.114 seconds)
 
 
 .. _sphx_glr_download_how_to_compile_models_from_darknet.py:
diff --git a/docs/_sources/how_to/compile_models/from_oneflow.rst.txt b/docs/_sources/how_to/compile_models/from_oneflow.rst.txt
index 07c240ae94..8d40eba8d7 100644
--- a/docs/_sources/how_to/compile_models/from_oneflow.rst.txt
+++ b/docs/_sources/how_to/compile_models/from_oneflow.rst.txt
@@ -121,7 +121,7 @@ Load a pretrained OneFlow model and save model
  .. code-block:: none
 
     Downloading: "https://oneflow-public.oss-cn-beijing.aliyuncs.com/model_zoo/flowvision/classification/ResNet/resnet18.zip" to /workspace/.oneflow/flowvision_cache/resnet18.zip
-       0%|          | 0.00/41.5M [00:00<?, ?B/s]      15%|#5        | 6.33M/41.5M [00:00<00:00, 65.6MB/s]      35%|###4      | 14.3M/41.5M [00:00<00:00, 76.2MB/s]      52%|#####2    | 21.6M/41.5M [00:00<00:00, 41.1MB/s]      64%|######4   | 26.7M/41.5M [00:00<00:00, 25.5MB/s]      77%|#######7  | 32.0M/41.5M [00:01<00:00, 29.1MB/s]      92%|#########2| 38.3M/41.5M [00:01<00:00, 35.2MB/s]     100%|##########| 41.5M/41.5M [00:01<00:00, 36.1MB/s]
+       0%|          | 0.00/41.5M [00:00<?, ?B/s]      19%|#8        | 7.72M/41.5M [00:00<00:00, 80.9MB/s]      37%|###7      | 15.4M/41.5M [00:00<00:00, 33.5MB/s]      48%|####8     | 20.0M/41.5M [00:00<00:00, 32.6MB/s]      57%|#####7    | 23.8M/41.5M [00:00<00:00, 21.9MB/s]      64%|######3   | 26.5M/41.5M [00:01<00:00, 20.2MB/s]      77%|#######7  | 32.0M/41.5M [00:01<00:00, 25.2MB/s]      92%|#########2| 38.3M/41.5M [00:01<00:00, 30.4MB/s]     100%|##########| 41.5M/41.5M [00:01<00:00, 28.6MB/s]
 
 
 
diff --git a/docs/_sources/how_to/compile_models/from_paddle.rst.txt b/docs/_sources/how_to/compile_models/from_paddle.rst.txt
index d916031556..8df75c9e33 100644
--- a/docs/_sources/how_to/compile_models/from_paddle.rst.txt
+++ b/docs/_sources/how_to/compile_models/from_paddle.rst.txt
@@ -207,11 +207,6 @@ Look up prediction top 1 index in 1000 class synset.
 
 
 
-.. rst-class:: sphx-glr-timing
-
-   **Total running time of the script:** ( 1 minutes  1.325 seconds)
-
-
 .. _sphx_glr_download_how_to_compile_models_from_paddle.py:
 
 .. only:: html
diff --git a/docs/_sources/how_to/compile_models/from_pytorch.rst.txt b/docs/_sources/how_to/compile_models/from_pytorch.rst.txt
index 2ffe122e5b..90f4f7aedc 100644
--- a/docs/_sources/how_to/compile_models/from_pytorch.rst.txt
+++ b/docs/_sources/how_to/compile_models/from_pytorch.rst.txt
@@ -101,7 +101,7 @@ Load a pretrained PyTorch model
     /venv/apache-tvm-py3.8/lib/python3.8/site-packages/torchvision/models/_utils.py:223: UserWarning: Arguments other than a weight enum or `None` for 'weights' are deprecated since 0.13 and may be removed in the future. The current behavior is equivalent to passing `weights=ResNet18_Weights.IMAGENET1K_V1`. You can also use `weights=ResNet18_Weights.DEFAULT` to get the most up-to-date weights.
       warnings.warn(msg)
     Downloading: "https://download.pytorch.org/models/resnet18-f37072fd.pth" to /workspace/.cache/torch/hub/checkpoints/resnet18-f37072fd.pth
-       0%|          | 0.00/44.7M [00:00<?, ?B/s]      15%|#5        | 6.84M/44.7M [00:00<00:00, 71.7MB/s]      31%|###       | 13.7M/44.7M [00:00<00:00, 52.3MB/s]      42%|####2     | 19.0M/44.7M [00:00<00:00, 45.7MB/s]      54%|#####3    | 24.0M/44.7M [00:00<00:00, 38.8MB/s]      72%|#######1  | 32.0M/44.7M [00:00<00:00, 38.0MB/s]      86%|########5 | 38.3M/44.7M [00:01<00:00, 35.1MB/s]      94%|#########3| 41.8M/44.7M [00:01<00:00, 32.9MB/s]     100%|##########| 44.7M/44.7M [00:01<00:00, 39.9MB/s]
+       0%|          | 0.00/44.7M [00:00<?, ?B/s]      14%|#4        | 6.30M/44.7M [00:00<00:00, 44.3MB/s]      24%|##3       | 10.5M/44.7M [00:00<00:00, 39.6MB/s]      32%|###2      | 14.3M/44.7M [00:00<00:00, 33.1MB/s]      39%|###9      | 17.5M/44.7M [00:00<00:00, 30.1MB/s]      54%|#####3    | 24.0M/44.7M [00:00<00:00, 40.9MB/s]      63%|######2   | 28.1M/44.7M [00:00<00:00, 33.9MB/s]      71%|#######   | 31.7M/44.7M [00:00<00:00, 33.1MB/s]      78%|#######8  | 35.0M/44.7M [00:01<00:00, 29.2MB/s]      86%|########5 | 38.3M/44.7M [00:01<00:00, 28.8MB/s]     100%|#########9| 44.7M/44.7M [00:01<00:00, 38.1MB/s]     100%|##########| 44.7M/44.7M [00:01<00:00, 34.9MB/s]
 
 
 
diff --git a/docs/_sources/how_to/compile_models/from_tensorflow.rst.txt b/docs/_sources/how_to/compile_models/from_tensorflow.rst.txt
index 203be5d7f9..26f3cefe4f 100644
--- a/docs/_sources/how_to/compile_models/from_tensorflow.rst.txt
+++ b/docs/_sources/how_to/compile_models/from_tensorflow.rst.txt
@@ -430,7 +430,7 @@ Run the corresponding model on tensorflow
 
 .. rst-class:: sphx-glr-timing
 
-   **Total running time of the script:** ( 1 minutes  32.962 seconds)
+   **Total running time of the script:** ( 1 minutes  27.491 seconds)
 
 
 .. _sphx_glr_download_how_to_compile_models_from_tensorflow.py:
diff --git a/docs/_sources/how_to/compile_models/sg_execution_times.rst.txt b/docs/_sources/how_to/compile_models/sg_execution_times.rst.txt
index 2a288dd9bd..d4fe32c39e 100644
--- a/docs/_sources/how_to/compile_models/sg_execution_times.rst.txt
+++ b/docs/_sources/how_to/compile_models/sg_execution_times.rst.txt
@@ -5,24 +5,24 @@
 
 Computation times
 =================
-**06:41.067** total execution time for **how_to_compile_models** files:
+**06:22.235** total execution time for **how_to_compile_models** files:
 
 +-----------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_compile_models_from_darknet.py` (``from_darknet.py``)       | 01:34.628 | 0.0 MB |
+| :ref:`sphx_glr_how_to_compile_models_from_darknet.py` (``from_darknet.py``)       | 01:33.114 | 0.0 MB |
 +-----------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_compile_models_from_tensorflow.py` (``from_tensorflow.py``) | 01:32.962 | 0.0 MB |
+| :ref:`sphx_glr_how_to_compile_models_from_tensorflow.py` (``from_tensorflow.py``) | 01:27.491 | 0.0 MB |
 +-----------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_compile_models_from_paddle.py` (``from_paddle.py``)         | 01:01.325 | 0.0 MB |
+| :ref:`sphx_glr_how_to_compile_models_from_paddle.py` (``from_paddle.py``)         | 00:57.633 | 0.0 MB |
 +-----------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_compile_models_from_oneflow.py` (``from_oneflow.py``)       | 00:43.983 | 0.0 MB |
+| :ref:`sphx_glr_how_to_compile_models_from_oneflow.py` (``from_oneflow.py``)       | 00:41.333 | 0.0 MB |
 +-----------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_compile_models_from_coreml.py` (``from_coreml.py``)         | 00:37.967 | 0.0 MB |
+| :ref:`sphx_glr_how_to_compile_models_from_coreml.py` (``from_coreml.py``)         | 00:35.889 | 0.0 MB |
 +-----------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_compile_models_from_pytorch.py` (``from_pytorch.py``)       | 00:29.332 | 0.0 MB |
+| :ref:`sphx_glr_how_to_compile_models_from_pytorch.py` (``from_pytorch.py``)       | 00:27.349 | 0.0 MB |
 +-----------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_compile_models_from_keras.py` (``from_keras.py``)           | 00:25.769 | 0.0 MB |
+| :ref:`sphx_glr_how_to_compile_models_from_keras.py` (``from_keras.py``)           | 00:24.610 | 0.0 MB |
 +-----------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_compile_models_from_tflite.py` (``from_tflite.py``)         | 00:12.048 | 0.0 MB |
+| :ref:`sphx_glr_how_to_compile_models_from_tflite.py` (``from_tflite.py``)         | 00:12.040 | 0.0 MB |
 +-----------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_compile_models_from_onnx.py` (``from_onnx.py``)             | 00:03.054 | 0.0 MB |
+| :ref:`sphx_glr_how_to_compile_models_from_onnx.py` (``from_onnx.py``)             | 00:02.776 | 0.0 MB |
 +-----------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/how_to/deploy/index.rst.txt b/docs/_sources/how_to/deploy/index.rst.txt
index ac1e2a1276..4c3f30964b 100644
--- a/docs/_sources/how_to/deploy/index.rst.txt
+++ b/docs/_sources/how_to/deploy/index.rst.txt
@@ -176,6 +176,7 @@ target device without relying on RPC. See the following resources on how to do s
    tensorrt
    vitis_ai
    bnns
+   mrvl
 
 Additional Deployment How-Tos
 -----------------------------
diff --git a/docs/_sources/how_to/deploy/mrvl.rst.txt b/docs/_sources/how_to/deploy/mrvl.rst.txt
new file mode 100644
index 0000000000..0b0b81ed34
--- /dev/null
+++ b/docs/_sources/how_to/deploy/mrvl.rst.txt
@@ -0,0 +1,235 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+..    http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+Marvell Machine Learning Integration
+====================================
+
+1. Introduction
+---------------
+Marvell(R) supports a family of high-performance Data Processing
+Units (DPUs) with integrated compute, high-speed I/O and workload
+accelerators. These workload accelerators include Marvell's
+Machine Learning Inference Processor (MLIP), a highly optimized,
+integrated inference engine.
+
+TVM supports Marvell's MLIP through the "mrvl" library, which partitions
+a model and compiles the supported operations for accelerated execution
+on the MLIP, while the remaining operations fall back to LLVM for
+general compute.
+
+At runtime, the library supports native execution on MLIP hardware
+as well as Marvell's ML simulator (mlModel).
+
+The library supports Marvell's Octeon family of processors with ML accelerators.
+
+This guide demonstrates how to build TVM with the mrvl codegen and
+runtime enabled. It also provides example code to compile and run
+models using the 'mrvl' runtime.
+
+2. Building TVM with mrvl support
+---------------------------------
+
+2.1 Clone TVM repo
+-------------------
+
+Refer to the following TVM documentation for instructions on cloning TVM:
+https://tvm.apache.org/docs/install/from_source.html
+
+2.2 Build and start the TVM - mrvl docker container
+----------------------------------------------------
+
+.. code:: bash
+
+    ./docker/build.sh demo_mrvl bash                              # Build the docker container
+    ./docker/bash.sh tvm.demo_mrvl --env PYTHONPATH=$PWD/python   # Load the docker image
+
+
+3. Build TVM inside the docker container with mrvl (inside tvm directory)
+-------------------------------------------------------------------------
+
+.. code:: bash
+
+      ./tests/scripts/task_config_build_mrvl.sh build
+      cd build
+      cmake ..
+      make -j$(nproc)   # nproc = 4/8/..  (Number of Parallel jobs)
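+
+The ``task_config_build_mrvl.sh`` step above generates a ``config.cmake``
+in ``build/`` with the Marvell codegen enabled. If you prefer to configure
+the build manually, the sketch below shows the equivalent steps; the CMake
+option name (``USE_MRVL``) is an assumption here, so check
+``cmake/config.cmake`` in your checkout for the authoritative name.
+
+.. code:: bash
+
+      # Manual alternative to task_config_build_mrvl.sh (option name assumed)
+      mkdir -p build && cd build
+      cp ../cmake/config.cmake .
+      echo "set(USE_MRVL ON)" >> config.cmake
+      cmake ..
+      make -j$(nproc)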
+
+4. Compiling a model using TVMC command line
+--------------------------------------------
+Models can be compiled and run for the mrvl target using TVMC,
+which is optimized for performance.
+
+Refer to the following TVMC documentation for the generic tvmc options:
+https://tvm.apache.org/docs/tutorial/tvmc_command_line_driver.html
+
+Additional mrvl-specific options may be added as attributes if
+necessary; their advanced usage is described later in this document.
+
+4.1 TVMC Compilation Flow for a model
+-------------------------------------
+
+Refer to the following TVM documentation for the compilation flow:
+https://tvm.apache.org/docs/arch/index.html#example-compilation-flow
+
+
+4.2. TVMC - Command line option(s): Syntax for mrvl target
+----------------------------------------------------------
+
+The following shows how to compile an ONNX model with tvmc for the mrvl target.
+
+**Syntax:**
+
+.. code:: bash
+
+    python3 -m tvm.driver.tvmc compile --target="mrvl, llvm" \
+        --target-llvm-<options> \
+        --target-mrvl-<options> \
+        --<tvm-generic-options> \
+        model_file.onnx
+
+The following is an example TVMC compile command for an ARMv9 core with an
+integrated MLIP cn10ka processor, using only 4 tiles in the block.
+
+**Example:**
+
+.. code:: bash
+
+    python3 -m tvm.driver.tvmc compile --target="mrvl, llvm" \
+        --target-llvm-mtriple=aarch64-linux-gnu --target-llvm-mcpu=neoverse-n2 \
+        --target-mrvl-num_tiles=4 \
+        --cross-compiler aarch64-linux-gnu-gcc \
+        --output model.tar \
+        mnist-12.onnx
+
+
+4.3. TVMC Compiler: mrvl-specific Command Line Options
+------------------------------------------------------
+
+.. code:: bash
+
+  --target-mrvl-mcpu
+  --target-mrvl-num_tiles
+  --target-mrvl-mattr
+
+**Description of mrvl options**
+
+* mcpu:
+    The CPU class of Marvell(R) ML Inference Processor;
+    possible values = {cn10ka, cnf10kb}; defaults to cn10ka
+
+* num_tiles:
+    Maximum number of tiles that may be used, possible values = {1,2,4,8}, defaults to 8
+
+* mattr:
+    Attributes for mrvl; possible values = {quantize, wb_pin_ocm}
+
+    mattr specifies the data type, code generation options and optimizations;
+    an example command that sets these options is shown after this list.
+
+    *The supported attributes are:*
+
+    **1. quantize**
+
+    Specifies the data type. Possible values = {fp16, int8}.
+    The default is fp16; int8 support is a work in progress and will be
+    completed in a future PR.
+
+    **2. wb_pin_ocm**
+
+    Optimizes runtime by preloading a model's weights and biases into
+    the on-chip memory. Possible values = {0, 1}. The default is 0 (no preload).
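+
+A minimal sketch of a compile command combining these options is shown below.
+The flag names are taken from the list above; the exact value syntax for
+``--target-mrvl-mattr`` is an assumption and may differ in your TVM version.
+
+.. code:: bash
+
+    # Sketch: cn10ka target, 4 tiles, weights/bias preloaded into on-chip memory
+    # (the mattr value syntax is assumed, not confirmed by this guide)
+    python3 -m tvm.driver.tvmc compile --target="mrvl, llvm" \
+        --target-llvm-mtriple=aarch64-linux-gnu --target-llvm-mcpu=neoverse-n2 \
+        --target-mrvl-mcpu=cn10ka \
+        --target-mrvl-num_tiles=4 \
+        --target-mrvl-mattr="wb_pin_ocm=1" \
+        --cross-compiler aarch64-linux-gnu-gcc \
+        --output model.tar \
+        mnist-12.onnx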
+
+5. Compilation - Generating model partitions
+--------------------------------------------
+
+In the TVMC mrvl flow, the model is partitioned into Marvell and LLVM regions.
+Building each partitioned Marvell subgraph generates serialized nodes.json and
+const.json files. The partitioned nodes.json is a representation of the model
+graph that is suitable for the Marvell mmlc compiler, which is distributed
+separately via the CDK.
+
+**Model Partition**
+
+.. code:: bash
+
+    python3 -m tvm.driver.tvmc compile --target="mrvl, llvm \
+    -mtriple=aarch64-linux-gnu -mcpu=neoverse-n2" \
+    --cross-compiler aarch64-linux-gnu-gcc \
+    --target-mrvl-num_tiles=4 --output model.tar model.onnx
+
+
+6. Compiling a model using Python APIs
+--------------------------------------
+
+In addition to using TVMC, models can also be compiled and run using the
+TVM Python API. Below is an example of compiling the MNIST model. Support
+for running the model will be added in a follow-up PR from mrvl.
+
+**Download MNIST model from the web**
+
+.. code:: bash
+
+    cd $HOME
+    wget https://github.com/onnx/models/raw/main/validated/vision/classification/mnist/model/mnist-12.onnx
+
+**Import the TVM and other dependent modules**
+
+.. code:: python
+
+    import tvm, onnx, os
+    import numpy as np
+    import tvm.relay as relay
+    from tvm.relay.op.contrib.mrvl import partition_for_mrvl
+    from tvm.relay.build_module import build
+    from keras.datasets import mnist
+
+**Load model onnx file**
+
+.. code:: python
+
+    onnx_model = onnx.load("mnist-12.onnx")
+
+**Create a Relay graph from MNIST model**
+
+.. code:: python
+
+    shape_dict = {'Input3' : (1,1,28,28)}
+    mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)
+
+**Define option dictionary and Partition the Model**
+
+Annotate and partition the graph for mrvl. All operations supported by mrvl
+will be marked and offloaded to the mrvl hardware accelerator. The remaining
+operations will go through the regular LLVM compilation and code generation for ARM.
+
+.. code:: python
+
+    tvm_target = "llvm"
+
+    option_dict = {'num_tiles': 4}
+
+    mod = partition_for_mrvl(mod, params, **option_dict)
+
+**Build the Relay Graph**
+
+Build the Relay graph using the new module returned by ``partition_for_mrvl``.
+The target must always be an LLVM (ARM) target. ``partition_for_mrvl`` passes
+the options from the dictionary into the config parameters needed by the
+compiler backend, so there is no need to modify them; just pass the dictionary
+along to the PassContext so the values can be read during compilation.
+
+.. code:: python
+
+    with tvm.transform.PassContext(opt_level=3, config={"relay.ext.mrvl.options": option_dict}):
+        model_lib = relay.build(mod, tvm_target, params=params)
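+
+**Save the compiled module (optional)**
+
+Exporting the result is not covered by the original steps above; the following
+is a minimal sketch using TVM's standard export API, with an arbitrary output
+file name.
+
+.. code:: python
+
+    # Write the compiled artifacts to a shared library for later deployment
+    model_lib.export_library("model_lib.so")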
diff --git a/docs/_sources/how_to/deploy_models/deploy_model_on_adreno.rst.txt b/docs/_sources/how_to/deploy_models/deploy_model_on_adreno.rst.txt
index e8c786147c..439eeff1bd 100644
--- a/docs/_sources/how_to/deploy_models/deploy_model_on_adreno.rst.txt
+++ b/docs/_sources/how_to/deploy_models/deploy_model_on_adreno.rst.txt
@@ -673,7 +673,7 @@ well as provides information about the model's performance
     Evaluate inference time cost...
     Execution time summary:
      mean (ms)   median (ms)    max (ms)     min (ms)     std (ms)  
-     3997.9147    3995.0318    4020.8693    3993.1790      7.8819                  
+     3993.9828    3989.9656    4021.5015    3988.6683      9.4269                  
 
 
 
@@ -681,7 +681,7 @@ well as provides information about the model's performance
 
 .. rst-class:: sphx-glr-timing
 
-   **Total running time of the script:** ( 1 minutes  17.688 seconds)
+   **Total running time of the script:** ( 1 minutes  16.408 seconds)
 
 
 .. _sphx_glr_download_how_to_deploy_models_deploy_model_on_adreno.py:
diff --git a/docs/_sources/how_to/deploy_models/deploy_model_on_adreno_tvmc.rst.txt b/docs/_sources/how_to/deploy_models/deploy_model_on_adreno_tvmc.rst.txt
index 13163b4413..eb0577aea2 100644
--- a/docs/_sources/how_to/deploy_models/deploy_model_on_adreno_tvmc.rst.txt
+++ b/docs/_sources/how_to/deploy_models/deploy_model_on_adreno_tvmc.rst.txt
@@ -127,7 +127,7 @@ Make a Keras Resnet50 Model
  .. code-block:: none
 
     Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/resnet/resnet50_weights_tf_dim_ordering_tf_kernels.h5
-          8192/102967424 [..............................] - ETA: 0s       4653056/102967424 [>.............................] - ETA: 1s       8380416/102967424 [=>............................] - ETA: 2s      15024128/102967424 [===>..........................] - ETA: 1s      16769024/102967424 [===>..........................] - ETA: 1s      23412736/102967424 [=====>........................] - ETA: 1s      25157632/102967424 [======>.......................] - ETA: 1s      33546240/102967424 [========>.....................] - ETA: 1s
       41934848/102967424 [===========>..................] - ETA: 0s      48578560/102967424 [=============>................] - ETA: 0s      50323456/102967424 [=============>................] - ETA: 0s      52518912/102967424 [==============>...............] - ETA: 0s      58728448/102967424 [================>.............] - ETA: 0s      67100672/102967424 [==================>...........] - ETA: 0s      75489280/102967424 [====================>.........] - ETA: 0s      83877888/102967424 [=======================>......] -
  ETA: 0s      92266496/102967424 [=========================>....] - ETA: 0s     100646912/102967424 [============================>.] - ETA: 0s     102850560/102967424 [============================>.] - ETA: 0s     102967424/102967424 [==============================] - 1s 0us/step
+          8192/102967424 [..............................] - ETA: 0s       1572864/102967424 [..............................] - ETA: 3s       2785280/102967424 [..............................] - ETA: 4s       8380416/102967424 [=>............................] - ETA: 2s      16769024/102967424 [===>..........................] - ETA: 2s      16908288/102967424 [===>..........................] - ETA: 2s      23412736/102967424 [=====>........................] - ETA: 1s      25157632/102967424 [======>.......................] - ETA: 1s
       33546240/102967424 [========>.....................] - ETA: 1s      41934848/102967424 [===========>..................] - ETA: 1s      50323456/102967424 [=============>................] - ETA: 1s      58712064/102967424 [================>.............] - ETA: 0s      67100672/102967424 [==================>...........] - ETA: 0s      69296128/102967424 [===================>..........] - ETA: 0s      72753152/102967424 [====================>.........] - ETA: 0s      73744384/102967424 [====================>.........] -
  ETA: 0s      75489280/102967424 [====================>.........] - ETA: 0s      83877888/102967424 [=======================>......] - ETA: 0s      90521600/102967424 [=========================>....] - ETA: 0s      92405760/102967424 [=========================>....] - ETA: 0s     100368384/102967424 [============================>.] - ETA: 0s     100646912/102967424 [============================>.] - ETA: 0s     102967424/102967424 [==============================] - 2s 0us/step
 
 
 
diff --git a/docs/_sources/how_to/deploy_models/deploy_model_on_android.rst.txt b/docs/_sources/how_to/deploy_models/deploy_model_on_android.rst.txt
index 26d2643594..39c1e7167c 100644
--- a/docs/_sources/how_to/deploy_models/deploy_model_on_android.rst.txt
+++ b/docs/_sources/how_to/deploy_models/deploy_model_on_android.rst.txt
@@ -437,7 +437,7 @@ Execute on TVM
     Evaluate inference time cost...
     Execution time summary:
      mean (ms)   median (ms)    max (ms)     min (ms)     std (ms)  
-      14.6827      14.6408      14.8761      14.5288       0.1319                  
+      14.2411      14.0100      14.7648      13.7551       0.3886                  
 
 
 
diff --git a/docs/_sources/how_to/deploy_models/deploy_object_detection_pytorch.rst.txt b/docs/_sources/how_to/deploy_models/deploy_object_detection_pytorch.rst.txt
index e9fa8c7d60..cf0fc4b256 100644
--- a/docs/_sources/how_to/deploy_models/deploy_object_detection_pytorch.rst.txt
+++ b/docs/_sources/how_to/deploy_models/deploy_object_detection_pytorch.rst.txt
@@ -130,7 +130,7 @@ Load pre-trained maskrcnn from torchvision and do tracing
     /venv/apache-tvm-py3.8/lib/python3.8/site-packages/torchvision/models/_utils.py:223: UserWarning: Arguments other than a weight enum or `None` for 'weights' are deprecated since 0.13 and may be removed in the future. The current behavior is equivalent to passing `weights=MaskRCNN_ResNet50_FPN_Weights.COCO_V1`. You can also use `weights=MaskRCNN_ResNet50_FPN_Weights.DEFAULT` to get the most up-to-date weights.
       warnings.warn(msg)
     Downloading: "https://download.pytorch.org/models/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth" to /workspace/.cache/torch/hub/checkpoints/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth
-       0%|          | 0.00/170M [00:00<?, ?B/s]       4%|4         | 7.47M/170M [00:00<00:02, 78.3MB/s]       9%|8         | 14.9M/170M [00:00<00:05, 31.6MB/s]      11%|#1        | 19.3M/170M [00:00<00:05, 28.1MB/s]      14%|#4        | 24.0M/170M [00:00<00:04, 31.8MB/s]      19%|#8        | 32.0M/170M [00:00<00:04, 35.6MB/s]      24%|##3       | 40.0M/170M [00:01<00:03, 42.0MB/s]      28%|##8       | 48.0M/170M [00:01<00:03, 39.5MB/s]      33%|###2      | 56.0M/170M [00:01<00:02, 42.7MB/s]      37%|###6      | 62.3M/170M [00:01<00:02, 45.8MB/s]      40%|###9      | 67.3M/170M [00:01<00:02, 47.2MB/s]      42%|####2     | 72.1M/170M [00:01<00:02, 43.8MB/s]      47%|####7     | 80.0M/170M [00:02<00:02, 45.9MB/s]      52%|#####1    | 88.0M/170M [00:02<00:01, 44.8MB/s]      56%|#####5    | 94.3M/170M [00:02<00:01, 46.9MB/s]      58%|#####8    | 98.9M/170M [00:02<00:01, 39.8MB/s]      61%|######1   | 104M/170M [00:02<00:01, 39.8MB/s]       66%|######5   | 112M/170M [00:02<00:01, 48.0MB/s
 ]      71%|#######   | 120M/170M [00:02<00:00, 52.8MB/s]      74%|#######4  | 126M/170M [00:03<00:00, 55.8MB/s]      78%|#######7  | 132M/170M [00:03<00:00, 51.7MB/s]      81%|########  | 137M/170M [00:03<00:00, 41.4MB/s]      85%|########4 | 144M/170M [00:03<00:00, 45.5MB/s]      88%|########8 | 150M/170M [00:03<00:00, 47.6MB/s]      91%|#########1| 155M/170M [00:03<00:00, 37.7MB/s]      94%|#########4| 160M/170M [00:03<00:00, 37.2MB/s]      98%|#########7| 166M/170M [00:04<00:00, 33.9MB/s]     100%|##########| 170M/170M [00:04<00:00, 41.5MB/s]
+       0%|          | 0.00/170M [00:00<?, ?B/s]       4%|3         | 6.30M/170M [00:00<00:04, 41.1MB/s]       6%|6         | 10.2M/170M [00:00<00:05, 32.4MB/s]       8%|8         | 14.3M/170M [00:00<00:07, 21.0MB/s]      10%|9         | 16.6M/170M [00:00<00:07, 20.5MB/s]      14%|#4        | 24.0M/170M [00:00<00:05, 27.7MB/s]      19%|#8        | 32.0M/170M [00:01<00:04, 34.6MB/s]      24%|##3       | 40.0M/170M [00:01<00:03, 42.2MB/s]      27%|##7       | 46.3M/170M [00:01<00:03, 40.2MB/s]      30%|##9       | 50.3M/170M [00:01<00:03, 40.2MB/s]      33%|###2      | 56.0M/170M [00:01<00:02, 40.9MB/s]      37%|###6      | 62.3M/170M [00:01<00:02, 44.6MB/s]      39%|###9      | 66.7M/170M [00:01<00:02, 40.1MB/s]      42%|####2     | 72.0M/170M [00:02<00:02, 42.7MB/s]      46%|####6     | 78.3M/170M [00:02<00:02, 41.2MB/s]      48%|####8     | 82.3M/170M [00:02<00:02, 34.5MB/s]      51%|#####     | 86.3M/170M [00:02<00:02, 35.2MB/s]      53%|#####2    | 89.8M/170M [00:02<00:02, 32.6MB/
 s]      56%|#####5    | 94.3M/170M [00:02<00:02, 29.5MB/s]      57%|#####7    | 97.2M/170M [00:03<00:02, 27.0MB/s]      60%|######    | 102M/170M [00:03<00:02, 28.2MB/s]       62%|######1   | 105M/170M [00:03<00:02, 22.9MB/s]      66%|######5   | 112M/170M [00:03<00:01, 31.9MB/s]      71%|#######   | 120M/170M [00:03<00:01, 40.2MB/s]      74%|#######4  | 126M/170M [00:03<00:01, 43.6MB/s]      77%|#######7  | 131M/170M [00:03<00:00, 41.5MB/s]      80%|########  | 136M/170M [00:04<00:01, 34.4MB/s]      85%|########4 | 144M/170M [00:04<00:00, 42.0MB/s]      88%|########8 | 150M/170M [00:04<00:00, 46.7MB/s]      91%|#########1| 155M/170M [00:04<00:00, 43.2MB/s]      94%|#########3| 160M/170M [00:04<00:00, 34.9MB/s]      96%|#########6| 163M/170M [00:04<00:00, 33.6MB/s]      98%|#########8| 167M/170M [00:05<00:00, 28.7MB/s]     100%|##########| 170M/170M [00:05<00:00, 34.8MB/s]
     /venv/apache-tvm-py3.8/lib/python3.8/site-packages/torch/nn/functional.py:3912: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
       (torch.floor((input.size(i + 2).float() * torch.tensor(scale_factors[i], dtype=torch.float32)).float()))
     /venv/apache-tvm-py3.8/lib/python3.8/site-packages/torchvision/ops/boxes.py:157: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
@@ -292,7 +292,7 @@ Get boxes with score larger than 0.9
 
 .. rst-class:: sphx-glr-timing
 
-   **Total running time of the script:** ( 3 minutes  30.916 seconds)
+   **Total running time of the script:** ( 3 minutes  20.110 seconds)
 
 
 .. _sphx_glr_download_how_to_deploy_models_deploy_object_detection_pytorch.py:
diff --git a/docs/_sources/how_to/deploy_models/deploy_prequantized.rst.txt b/docs/_sources/how_to/deploy_models/deploy_prequantized.rst.txt
index bb94af3c54..de71055ac8 100644
--- a/docs/_sources/how_to/deploy_models/deploy_prequantized.rst.txt
+++ b/docs/_sources/how_to/deploy_models/deploy_prequantized.rst.txt
@@ -227,7 +227,7 @@ training. Other models require a full post training calibration.
     /venv/apache-tvm-py3.8/lib/python3.8/site-packages/torchvision/models/_utils.py:223: UserWarning: Arguments other than a weight enum or `None` for 'weights' are deprecated since 0.13 and may be removed in the future. The current behavior is equivalent to passing `weights=MobileNet_V2_Weights.IMAGENET1K_V1`. You can also use `weights=MobileNet_V2_Weights.DEFAULT` to get the most up-to-date weights.
       warnings.warn(msg)
     Downloading: "https://download.pytorch.org/models/mobilenet_v2-b0353104.pth" to /workspace/.cache/torch/hub/checkpoints/mobilenet_v2-b0353104.pth
-       0%|          | 0.00/13.6M [00:00<?, ?B/s]      59%|#####8    | 7.99M/13.6M [00:00<00:00, 29.7MB/s]      90%|########9 | 12.2M/13.6M [00:00<00:00, 33.5MB/s]     100%|##########| 13.6M/13.6M [00:00<00:00, 35.9MB/s]
+       0%|          | 0.00/13.6M [00:00<?, ?B/s]      47%|####6     | 6.30M/13.6M [00:00<00:00, 39.8MB/s]      75%|#######4  | 10.1M/13.6M [00:00<00:00, 38.4MB/s]     100%|##########| 13.6M/13.6M [00:00<00:00, 33.4MB/s]
 
 
 
@@ -409,7 +409,7 @@ Here we give an example of how to measure performance of TVM compiled models.
 
     Execution time summary:
      mean (ms)   median (ms)    max (ms)     min (ms)     std (ms)  
-      86.1134      86.0590      87.1695      85.8505       0.2320                  
+      86.0615      86.0674      90.4477      85.4856       0.5247                  
 
 
 
@@ -457,7 +457,7 @@ TODO
 
 .. rst-class:: sphx-glr-timing
 
-   **Total running time of the script:** ( 1 minutes  27.347 seconds)
+   **Total running time of the script:** ( 1 minutes  24.367 seconds)
 
 
 .. _sphx_glr_download_how_to_deploy_models_deploy_prequantized.py:
diff --git a/docs/_sources/how_to/deploy_models/deploy_prequantized_tflite.rst.txt b/docs/_sources/how_to/deploy_models/deploy_prequantized_tflite.rst.txt
index 148747553f..9e2ca9c81b 100644
--- a/docs/_sources/how_to/deploy_models/deploy_prequantized_tflite.rst.txt
+++ b/docs/_sources/how_to/deploy_models/deploy_prequantized_tflite.rst.txt
@@ -423,7 +423,7 @@ Here we give an example of how to measure performance of TVM compiled models.
 
     Execution time summary:
      mean (ms)   median (ms)    max (ms)     min (ms)     std (ms)  
-      107.8047     107.4867     137.6822     106.9999      3.0148                  
+      103.5347     103.3730     105.4180     102.4175      0.7573                  
 
 
 
diff --git a/docs/_sources/how_to/deploy_models/sg_execution_times.rst.txt b/docs/_sources/how_to/deploy_models/sg_execution_times.rst.txt
index 4eba617191..1b9e39bc76 100644
--- a/docs/_sources/how_to/deploy_models/sg_execution_times.rst.txt
+++ b/docs/_sources/how_to/deploy_models/sg_execution_times.rst.txt
@@ -5,24 +5,24 @@
 
 Computation times
 =================
-**09:41.173** total execution time for **how_to_deploy_models** files:
+**09:19.801** total execution time for **how_to_deploy_models** files:
 
 +------------------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_deploy_models_deploy_object_detection_pytorch.py` (``deploy_object_detection_pytorch.py``) | 03:30.916 | 0.0 MB |
+| :ref:`sphx_glr_how_to_deploy_models_deploy_object_detection_pytorch.py` (``deploy_object_detection_pytorch.py``) | 03:20.110 | 0.0 MB |
 +------------------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_deploy_models_deploy_prequantized.py` (``deploy_prequantized.py``)                         | 01:27.347 | 0.0 MB |
+| :ref:`sphx_glr_how_to_deploy_models_deploy_prequantized.py` (``deploy_prequantized.py``)                         | 01:24.367 | 0.0 MB |
 +------------------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_deploy_models_deploy_model_on_adreno.py` (``deploy_model_on_adreno.py``)                   | 01:17.688 | 0.0 MB |
+| :ref:`sphx_glr_how_to_deploy_models_deploy_model_on_adreno.py` (``deploy_model_on_adreno.py``)                   | 01:16.408 | 0.0 MB |
 +------------------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_deploy_models_deploy_prequantized_tflite.py` (``deploy_prequantized_tflite.py``)           | 00:51.214 | 0.0 MB |
+| :ref:`sphx_glr_how_to_deploy_models_deploy_prequantized_tflite.py` (``deploy_prequantized_tflite.py``)           | 00:49.258 | 0.0 MB |
 +------------------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_deploy_models_deploy_model_on_android.py` (``deploy_model_on_android.py``)                 | 00:50.199 | 0.0 MB |
+| :ref:`sphx_glr_how_to_deploy_models_deploy_model_on_android.py` (``deploy_model_on_android.py``)                 | 00:48.145 | 0.0 MB |
 +------------------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_deploy_models_deploy_model_on_adreno_tvmc.py` (``deploy_model_on_adreno_tvmc.py``)         | 00:45.313 | 0.0 MB |
+| :ref:`sphx_glr_how_to_deploy_models_deploy_model_on_adreno_tvmc.py` (``deploy_model_on_adreno_tvmc.py``)         | 00:44.474 | 0.0 MB |
 +------------------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_deploy_models_deploy_model_on_nano.py` (``deploy_model_on_nano.py``)                       | 00:29.302 | 0.0 MB |
+| :ref:`sphx_glr_how_to_deploy_models_deploy_model_on_rasp.py` (``deploy_model_on_rasp.py``)                       | 00:28.678 | 0.0 MB |
 +------------------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_deploy_models_deploy_model_on_rasp.py` (``deploy_model_on_rasp.py``)                       | 00:29.183 | 0.0 MB |
+| :ref:`sphx_glr_how_to_deploy_models_deploy_model_on_nano.py` (``deploy_model_on_nano.py``)                       | 00:28.351 | 0.0 MB |
 +------------------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_deploy_models_deploy_sparse.py` (``deploy_sparse.py``)                                     | 00:00.011 | 0.0 MB |
+| :ref:`sphx_glr_how_to_deploy_models_deploy_sparse.py` (``deploy_sparse.py``)                                     | 00:00.010 | 0.0 MB |
 +------------------------------------------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/how_to/extend_tvm/sg_execution_times.rst.txt b/docs/_sources/how_to/extend_tvm/sg_execution_times.rst.txt
index 7bc7e55a00..ce0ca9d1f9 100644
--- a/docs/_sources/how_to/extend_tvm/sg_execution_times.rst.txt
+++ b/docs/_sources/how_to/extend_tvm/sg_execution_times.rst.txt
@@ -5,12 +5,12 @@
 
 Computation times
 =================
-**00:03.951** total execution time for **how_to_extend_tvm** files:
+**00:03.781** total execution time for **how_to_extend_tvm** files:
 
 +-------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_extend_tvm_use_pass_instrument.py` (``use_pass_instrument.py``)     | 00:02.781 | 0.0 MB |
+| :ref:`sphx_glr_how_to_extend_tvm_use_pass_instrument.py` (``use_pass_instrument.py``)     | 00:02.665 | 0.0 MB |
 +-------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_extend_tvm_use_pass_infra.py` (``use_pass_infra.py``)               | 00:01.163 | 0.0 MB |
+| :ref:`sphx_glr_how_to_extend_tvm_use_pass_infra.py` (``use_pass_infra.py``)               | 00:01.109 | 0.0 MB |
 +-------------------------------------------------------------------------------------------+-----------+--------+
 | :ref:`sphx_glr_how_to_extend_tvm_low_level_custom_pass.py` (``low_level_custom_pass.py``) | 00:00.007 | 0.0 MB |
 +-------------------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/how_to/extend_tvm/use_pass_instrument.rst.txt b/docs/_sources/how_to/extend_tvm/use_pass_instrument.rst.txt
index 6d6bd83d1d..6fde349b6a 100644
--- a/docs/_sources/how_to/extend_tvm/use_pass_instrument.rst.txt
+++ b/docs/_sources/how_to/extend_tvm/use_pass_instrument.rst.txt
@@ -220,10 +220,10 @@ profile the execution time of each passes.
  .. code-block:: none
 
     Printing results of timing profile...
-    InferType: 25716us [25716us] (49.16%; 49.16%)
-    FoldScaleAxis: 26597us [8us] (50.84%; 50.84%)
-            FoldConstant: 26588us [1795us] (50.83%; 99.97%)
-                    InferType: 24793us [24793us] (47.39%; 93.25%)
+    InferType: 24175us [24175us] (48.76%; 48.76%)
+    FoldScaleAxis: 25403us [6us] (51.24%; 51.24%)
+            FoldConstant: 25396us [1739us] (51.22%; 99.97%)
+                    InferType: 23657us [23657us] (47.72%; 93.15%)
 
 
 
@@ -262,10 +262,10 @@ Refer to following sections and :py:func:`tvm.instrument.pass_instrument` for th
  .. code-block:: none
 
     Printing results of timing profile...
-    InferType: 24354us [24354us] (48.30%; 48.30%)
-    FoldScaleAxis: 26071us [7us] (51.70%; 51.70%)
-            FoldConstant: 26064us [1818us] (51.69%; 99.97%)
-                    InferType: 24246us [24246us] (48.08%; 93.02%)
+    InferType: 23391us [23391us] (48.55%; 48.55%)
+    FoldScaleAxis: 24783us [5us] (51.45%; 51.45%)
+            FoldConstant: 24778us [1648us] (51.43%; 99.98%)
+                    InferType: 23130us [23130us] (48.01%; 93.35%)
 
 
 
diff --git a/docs/_sources/how_to/optimize_operators/opt_conv_cuda.rst.txt b/docs/_sources/how_to/optimize_operators/opt_conv_cuda.rst.txt
index 33d9e4fc91..673985d9e9 100644
--- a/docs/_sources/how_to/optimize_operators/opt_conv_cuda.rst.txt
+++ b/docs/_sources/how_to/optimize_operators/opt_conv_cuda.rst.txt
@@ -331,7 +331,7 @@ latency of convolution.
 
  .. code-block:: none
 
-    Convolution: 33.872543 ms
+    Convolution: 35.691360 ms
 
 
 
diff --git a/docs/_sources/how_to/optimize_operators/opt_conv_tensorcore.rst.txt b/docs/_sources/how_to/optimize_operators/opt_conv_tensorcore.rst.txt
index 2de65c2c2a..f7c268af38 100644
--- a/docs/_sources/how_to/optimize_operators/opt_conv_tensorcore.rst.txt
+++ b/docs/_sources/how_to/optimize_operators/opt_conv_tensorcore.rst.txt
@@ -598,7 +598,7 @@ be able to run on our build server
 
  .. code-block:: none
 
-    conv2d with tensor core: 12.265424 ms
+    conv2d with tensor core: 12.264919 ms
 
 
 
diff --git a/docs/_sources/how_to/optimize_operators/opt_gemm.rst.txt b/docs/_sources/how_to/optimize_operators/opt_gemm.rst.txt
index 85995aa18c..40b4744113 100644
--- a/docs/_sources/how_to/optimize_operators/opt_gemm.rst.txt
+++ b/docs/_sources/how_to/optimize_operators/opt_gemm.rst.txt
@@ -134,8 +134,8 @@ Then we write a baseline implementation, the simplest way to write a matrix mult
 
  .. code-block:: none
 
-    Numpy running time: 0.016915
-    Baseline: 3.385801
+    Numpy running time: 0.013416
+    Baseline: 3.391921
 
 
 
@@ -227,7 +227,7 @@ fill 32 * 32 * sizeof(float) which is 4KB in the cache whose total size is 32KB
 
  .. code-block:: none
 
-    Opt1: 0.300350
+    Opt1: 0.291530
 
 
 
@@ -318,7 +318,7 @@ In this tutorial, we chose to vectorize the inner loop row data since it is cach
 
  .. code-block:: none
 
-    Opt2: 0.259783
+    Opt2: 0.259740
 
 
 
@@ -406,7 +406,7 @@ the access pattern for A matrix is more cache friendly.
 
  .. code-block:: none
 
-    Opt3: 0.112768
+    Opt3: 0.110507
 
 
 
@@ -523,7 +523,7 @@ flattening.
 
  .. code-block:: none
 
-    Opt4: 0.103754
+    Opt4: 0.103994
 
 
 
@@ -635,7 +635,7 @@ write to C when all the block results are ready.
 
  .. code-block:: none
 
-    Opt5: 0.104403
+    Opt5: 0.094832
 
 
 
@@ -748,7 +748,7 @@ Furthermore, we can also utilize multi-core processors to do the thread-level pa
 
  .. code-block:: none
 
-    Opt6: 0.123906
+    Opt6: 0.111884
 
 
 
diff --git a/docs/_sources/how_to/optimize_operators/sg_execution_times.rst.txt b/docs/_sources/how_to/optimize_operators/sg_execution_times.rst.txt
index da4a42afd8..7cf75f7fa1 100644
--- a/docs/_sources/how_to/optimize_operators/sg_execution_times.rst.txt
+++ b/docs/_sources/how_to/optimize_operators/sg_execution_times.rst.txt
@@ -5,12 +5,12 @@
 
 Computation times
 =================
-**00:33.003** total execution time for **how_to_optimize_operators** files:
+**00:31.557** total execution time for **how_to_optimize_operators** files:
 
 +-----------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_optimize_operators_opt_gemm.py` (``opt_gemm.py``)                       | 00:29.838 | 0.0 MB |
+| :ref:`sphx_glr_how_to_optimize_operators_opt_gemm.py` (``opt_gemm.py``)                       | 00:28.564 | 0.0 MB |
 +-----------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_optimize_operators_opt_conv_tensorcore.py` (``opt_conv_tensorcore.py``) | 00:01.940 | 0.0 MB |
+| :ref:`sphx_glr_how_to_optimize_operators_opt_conv_tensorcore.py` (``opt_conv_tensorcore.py``) | 00:01.846 | 0.0 MB |
 +-----------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_optimize_operators_opt_conv_cuda.py` (``opt_conv_cuda.py``)             | 00:01.225 | 0.0 MB |
+| :ref:`sphx_glr_how_to_optimize_operators_opt_conv_cuda.py` (``opt_conv_cuda.py``)             | 00:01.147 | 0.0 MB |
 +-----------------------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/how_to/tune_with_autoscheduler/sg_execution_times.rst.txt b/docs/_sources/how_to/tune_with_autoscheduler/sg_execution_times.rst.txt
index 199784d866..6fd67d03f5 100644
--- a/docs/_sources/how_to/tune_with_autoscheduler/sg_execution_times.rst.txt
+++ b/docs/_sources/how_to/tune_with_autoscheduler/sg_execution_times.rst.txt
@@ -5,18 +5,18 @@
 
 Computation times
 =================
-**03:33.708** total execution time for **how_to_tune_with_autoscheduler** files:
+**03:25.824** total execution time for **how_to_tune_with_autoscheduler** files:
 
 +----------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_tune_with_autoscheduler_tune_network_x86.py` (``tune_network_x86.py``)             | 01:32.074 | 0.0 MB |
+| :ref:`sphx_glr_how_to_tune_with_autoscheduler_tune_network_x86.py` (``tune_network_x86.py``)             | 01:28.951 | 0.0 MB |
 +----------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_tune_with_autoscheduler_tune_network_cuda.py` (``tune_network_cuda.py``)           | 01:11.910 | 0.0 MB |
+| :ref:`sphx_glr_how_to_tune_with_autoscheduler_tune_network_cuda.py` (``tune_network_cuda.py``)           | 01:09.182 | 0.0 MB |
 +----------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_tune_with_autoscheduler_tune_network_arm.py` (``tune_network_arm.py``)             | 00:17.441 | 0.0 MB |
+| :ref:`sphx_glr_how_to_tune_with_autoscheduler_tune_network_arm.py` (``tune_network_arm.py``)             | 00:16.840 | 0.0 MB |
 +----------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_tune_with_autoscheduler_tune_network_mali.py` (``tune_network_mali.py``)           | 00:16.124 | 0.0 MB |
+| :ref:`sphx_glr_how_to_tune_with_autoscheduler_tune_network_mali.py` (``tune_network_mali.py``)           | 00:15.610 | 0.0 MB |
 +----------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_tune_with_autoscheduler_tune_conv2d_layer_cuda.py` (``tune_conv2d_layer_cuda.py``) | 00:16.062 | 0.0 MB |
+| :ref:`sphx_glr_how_to_tune_with_autoscheduler_tune_conv2d_layer_cuda.py` (``tune_conv2d_layer_cuda.py``) | 00:15.144 | 0.0 MB |
 +----------------------------------------------------------------------------------------------------------+-----------+--------+
 | :ref:`sphx_glr_how_to_tune_with_autoscheduler_tune_sparse_x86.py` (``tune_sparse_x86.py``)               | 00:00.098 | 0.0 MB |
 +----------------------------------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/how_to/tune_with_autoscheduler/tune_conv2d_layer_cuda.rst.txt b/docs/_sources/how_to/tune_with_autoscheduler/tune_conv2d_layer_cuda.rst.txt
index 5f6088c0e4..c38d9b6f64 100644
--- a/docs/_sources/how_to/tune_with_autoscheduler/tune_conv2d_layer_cuda.rst.txt
+++ b/docs/_sources/how_to/tune_with_autoscheduler/tune_conv2d_layer_cuda.rst.txt
@@ -766,7 +766,7 @@ We build the binary and check its correctness and performance.
 
  .. code-block:: none
 
-    Execution time of this operator: 0.344 ms
+    Execution time of this operator: 0.345 ms
 
 
 
diff --git a/docs/_sources/how_to/tune_with_autoscheduler/tune_network_cuda.rst.txt b/docs/_sources/how_to/tune_with_autoscheduler/tune_network_cuda.rst.txt
index dc76bab057..ffe1f2db33 100644
--- a/docs/_sources/how_to/tune_with_autoscheduler/tune_network_cuda.rst.txt
+++ b/docs/_sources/how_to/tune_with_autoscheduler/tune_network_cuda.rst.txt
@@ -633,7 +633,7 @@ so we can read the log file and load the best schedules.
     Evaluate inference time cost...
     Execution time summary:
      mean (ms)   median (ms)    max (ms)     min (ms)     std (ms)  
-       3.2051       3.2046       3.2064       3.2044       0.0009                  
+       3.2105       3.2106       3.2107       3.2103       0.0002                  
 
 
 
@@ -660,7 +660,7 @@ Other Tips
 
 .. rst-class:: sphx-glr-timing
 
-   **Total running time of the script:** ( 1 minutes  11.910 seconds)
+   **Total running time of the script:** ( 1 minutes  9.182 seconds)
 
 
 .. _sphx_glr_download_how_to_tune_with_autoscheduler_tune_network_cuda.py:
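The "read the log file and load the best schedules" step referenced in the hunk above follows the auto-scheduler pattern below. This is a minimal sketch; ``log_file``, ``mod``, ``params``, and ``target`` stand in for the tuning log and network produced earlier in that tutorial.

.. code-block:: python

    import tvm
    from tvm import auto_scheduler, relay

    log_file = "network-tuning.json"  # placeholder for the tutorial's log file

    def compile_with_history(mod, params, target):
        # Apply the best schedules recorded during tuning while building.
        with auto_scheduler.ApplyHistoryBest(log_file):
            with tvm.transform.PassContext(
                opt_level=3, config={"relay.backend.use_auto_scheduler": True}
            ):
                return relay.build(mod, target=target, params=params)

The resulting library is then loaded into a graph executor and benchmarked, which is what produces the mean/median/max/min/std summary rows shown in the diff.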
diff --git a/docs/_sources/how_to/tune_with_autoscheduler/tune_network_x86.rst.txt b/docs/_sources/how_to/tune_with_autoscheduler/tune_network_x86.rst.txt
index 3c9598f980..09776612bd 100644
--- a/docs/_sources/how_to/tune_with_autoscheduler/tune_network_x86.rst.txt
+++ b/docs/_sources/how_to/tune_with_autoscheduler/tune_network_x86.rst.txt
@@ -655,7 +655,7 @@ so we can read the log file and load the best schedules.
     Evaluate inference time cost...
     Execution time summary:
      mean (ms)   median (ms)    max (ms)     min (ms)     std (ms)  
-      734.4810     734.0194     736.5202     732.9033      1.5122                  
+      701.7361     700.7611     704.0504     700.3969      1.6431                  
 
 
 
@@ -682,7 +682,7 @@ Other Tips
 
 .. rst-class:: sphx-glr-timing
 
-   **Total running time of the script:** ( 1 minutes  32.074 seconds)
+   **Total running time of the script:** ( 1 minutes  28.951 seconds)
 
 
 .. _sphx_glr_download_how_to_tune_with_autoscheduler_tune_network_x86.py:
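The "Evaluate inference time cost..." summary in this hunk is produced by benchmarking the compiled module. A minimal sketch of that evaluation, with ``lib``, ``dev``, the input name, and the shape treated as placeholders for the tutorial's ResNet workload:

.. code-block:: python

    import numpy as np
    import tvm
    from tvm.contrib import graph_executor

    def evaluate(lib, dev, input_name="data", shape=(1, 3, 224, 224)):
        module = graph_executor.GraphModule(lib["default"](dev))
        module.set_input(input_name, np.random.uniform(size=shape).astype("float32"))
        # Prints mean/median/max/min/std over the repeated runs.
        print(module.benchmark(dev, repeat=3, min_repeat_ms=500))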
diff --git a/docs/_sources/how_to/tune_with_autotvm/sg_execution_times.rst.txt b/docs/_sources/how_to/tune_with_autotvm/sg_execution_times.rst.txt
index 913c50fd0c..b45261fb1d 100644
--- a/docs/_sources/how_to/tune_with_autotvm/sg_execution_times.rst.txt
+++ b/docs/_sources/how_to/tune_with_autotvm/sg_execution_times.rst.txt
@@ -5,10 +5,10 @@
 
 Computation times
 =================
-**00:23.486** total execution time for **how_to_tune_with_autotvm** files:
+**00:22.820** total execution time for **how_to_tune_with_autotvm** files:
 
 +--------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_tune_with_autotvm_tune_conv2d_cuda.py` (``tune_conv2d_cuda.py``)           | 00:23.450 | 0.0 MB |
+| :ref:`sphx_glr_how_to_tune_with_autotvm_tune_conv2d_cuda.py` (``tune_conv2d_cuda.py``)           | 00:22.784 | 0.0 MB |
 +--------------------------------------------------------------------------------------------------+-----------+--------+
 | :ref:`sphx_glr_how_to_tune_with_autotvm_tune_relay_x86.py` (``tune_relay_x86.py``)               | 00:00.021 | 0.0 MB |
 +--------------------------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/how_to/tune_with_autotvm/tune_conv2d_cuda.rst.txt b/docs/_sources/how_to/tune_with_autotvm/tune_conv2d_cuda.rst.txt
index 1688768955..9f99be4af5 100644
--- a/docs/_sources/how_to/tune_with_autotvm/tune_conv2d_cuda.rst.txt
+++ b/docs/_sources/how_to/tune_with_autotvm/tune_conv2d_cuda.rst.txt
@@ -326,7 +326,7 @@ and measure running time.
 
     Best config:
     ,None
-    Time cost of this operator: 0.037160
+    Time cost of this operator: 0.037194
 
 
 
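The "Best config" and time-cost lines above come from rebuilding the operator with the best record found by AutoTVM. A rough sketch of that step, assuming a conv2d template function that returns a schedule and its argument buffers as in the tutorial (``conv2d_template``, its arguments, and the log file name are placeholders):

.. code-block:: python

    import tvm
    from tvm import autotvm

    def build_best(conv2d_template, args, log_file="conv2d.log"):
        # Pick the best record from the tuning log and build with it applied.
        with autotvm.apply_history_best(log_file):
            with tvm.target.Target("cuda"):
                s, arg_bufs = conv2d_template(*args)
                return tvm.build(s, arg_bufs)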
diff --git a/docs/_sources/how_to/work_with_microtvm/micro_autotune.rst.txt b/docs/_sources/how_to/work_with_microtvm/micro_autotune.rst.txt
index 6f949b67dc..859fd15a83 100644
--- a/docs/_sources/how_to/work_with_microtvm/micro_autotune.rst.txt
+++ b/docs/_sources/how_to/work_with_microtvm/micro_autotune.rst.txt
@@ -360,10 +360,10 @@ Timing the untuned program
     ########## Build without Autotuning ##########
     Node Name                                     Ops                                           Time(us)  Time(%)  Shape              Inputs  Outputs  Measurements(us)  
     ---------                                     ---                                           --------  -------  -----              ------  -------  ----------------  
-    tvmgen_default_fused_nn_contrib_conv2d_NCHWc  tvmgen_default_fused_nn_contrib_conv2d_NCHWc  296.0     98.705   (1, 2, 10, 10, 3)  2       1        [296.0]           
-    tvmgen_default_fused_layout_transform_1       tvmgen_default_fused_layout_transform_1       2.941     0.981    (1, 6, 10, 10)     1       1        [2.941]           
-    tvmgen_default_fused_layout_transform         tvmgen_default_fused_layout_transform         0.943     0.315    (1, 1, 10, 10, 3)  1       1        [0.943]           
-    Total_time                                    -                                             299.885   -        -                  -       -        -                 
+    tvmgen_default_fused_nn_contrib_conv2d_NCHWc  tvmgen_default_fused_nn_contrib_conv2d_NCHWc  305.1     98.768   (1, 2, 10, 10, 3)  2       1        [305.1]           
+    tvmgen_default_fused_layout_transform_1       tvmgen_default_fused_layout_transform_1       2.868     0.928    (1, 6, 10, 10)     1       1        [2.868]           
+    tvmgen_default_fused_layout_transform         tvmgen_default_fused_layout_transform         0.937     0.303    (1, 1, 10, 10, 3)  1       1        [0.937]           
+    Total_time                                    -                                             308.905   -        -                  -       -        -                 
 
 
 
@@ -428,10 +428,10 @@ Timing the tuned program
     ########## Build with Autotuning ##########
     Node Name                                     Ops                                           Time(us)  Time(%)  Shape              Inputs  Outputs  Measurements(us)  
     ---------                                     ---                                           --------  -------  -----              ------  -------  ----------------  
-    tvmgen_default_fused_nn_contrib_conv2d_NCHWc  tvmgen_default_fused_nn_contrib_conv2d_NCHWc  130.9     98.01    (1, 6, 10, 10, 1)  2       1        [130.9]           
-    tvmgen_default_fused_layout_transform_1       tvmgen_default_fused_layout_transform_1       1.714     1.283    (1, 6, 10, 10)     1       1        [1.714]           
-    tvmgen_default_fused_layout_transform         tvmgen_default_fused_layout_transform         0.943     0.706    (1, 1, 10, 10, 3)  1       1        [0.943]           
-    Total_time                                    -                                             133.557   -        -                  -       -        -                 
+    tvmgen_default_fused_nn_contrib_conv2d_NCHWc  tvmgen_default_fused_nn_contrib_conv2d_NCHWc  129.8     98.006   (1, 6, 10, 10, 1)  2       1        [129.8]           
+    tvmgen_default_fused_layout_transform_1       tvmgen_default_fused_layout_transform_1       1.717     1.297    (1, 6, 10, 10)     1       1        [1.717]           
+    tvmgen_default_fused_layout_transform         tvmgen_default_fused_layout_transform         0.924     0.697    (1, 1, 10, 10, 3)  1       1        [0.924]           
+    Total_time                                    -                                             132.441   -        -                  -       -        -                 
 
 
 
@@ -439,7 +439,7 @@ Timing the tuned program
 
 .. rst-class:: sphx-glr-timing
 
-   **Total running time of the script:** ( 1 minutes  24.970 seconds)
+   **Total running time of the script:** ( 1 minutes  22.628 seconds)
 
 
 .. _sphx_glr_download_how_to_work_with_microtvm_micro_autotune.py:
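The per-operator tables above (Node Name / Time(us) / Time(%)) are produced by a debug/profiling executor run. The microTVM tutorial does this through a microTVM session; the sketch below shows the same idea on the host, under the assumption that ``mod`` and ``params`` hold the small CNN used in the tutorial, so treat it as an analogue rather than the tutorial's exact code.

.. code-block:: python

    import tvm
    from tvm import relay
    from tvm.contrib.debugger import debug_executor

    def profile_per_op(mod, params, input_name, data):
        with tvm.transform.PassContext(opt_level=3):
            lib = relay.build(mod, target="llvm", params=params)
        dev = tvm.cpu(0)
        # The debug executor times each node and prints a breakdown similar
        # to the tables above.
        m = debug_executor.create(lib.get_graph_json(), lib.get_lib(), dev)
        m.set_input(input_name, data)
        m.run()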
diff --git a/docs/_sources/how_to/work_with_microtvm/micro_pytorch.rst.txt b/docs/_sources/how_to/work_with_microtvm/micro_pytorch.rst.txt
index 5672fcd0ec..f935903142 100644
--- a/docs/_sources/how_to/work_with_microtvm/micro_pytorch.rst.txt
+++ b/docs/_sources/how_to/work_with_microtvm/micro_pytorch.rst.txt
@@ -118,7 +118,7 @@ download a cat image and preprocess it to use as the model input.
     /venv/apache-tvm-py3.8/lib/python3.8/site-packages/torch/ao/quantization/utils.py:310: UserWarning: must run observer before calling calculate_qparams. Returning default values.
       warnings.warn(
     Downloading: "https://download.pytorch.org/models/quantized/mobilenet_v2_qnnpack_37f702c5.pth" to /workspace/.cache/torch/hub/checkpoints/mobilenet_v2_qnnpack_37f702c5.pth
-       0%|          | 0.00/3.42M [00:00<?, ?B/s]      61%|######    | 2.09M/3.42M [00:00<00:00, 15.8MB/s]     100%|##########| 3.42M/3.42M [00:00<00:00, 25.0MB/s]
+       0%|          | 0.00/3.42M [00:00<?, ?B/s]      61%|######    | 2.09M/3.42M [00:00<00:00, 11.8MB/s]     100%|##########| 3.42M/3.42M [00:00<00:00, 18.3MB/s]
     /venv/apache-tvm-py3.8/lib/python3.8/site-packages/torch/_utils.py:314: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly.  To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
       device=storage.device,
     /workspace/python/tvm/relay/frontend/pytorch_utils.py:47: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
@@ -326,7 +326,7 @@ Look up prediction top 1 index in 1000 class synset.
 
 .. rst-class:: sphx-glr-timing
 
-   **Total running time of the script:** ( 1 minutes  27.905 seconds)
+   **Total running time of the script:** ( 1 minutes  24.486 seconds)
 
 
 .. _sphx_glr_download_how_to_work_with_microtvm_micro_pytorch.py:
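The hunk headers above reference two steps of that tutorial: downloading and preprocessing a cat image, and mapping the top-1 prediction to a class name. A minimal sketch of both, with the URL treated as a placeholder and the ImageNet mean/std normalization that the tutorial applies omitted for brevity:

.. code-block:: python

    import numpy as np
    from PIL import Image
    from tvm.contrib.download import download_testdata

    img_url = "https://github.com/dmlc/mxnet.js/blob/main/data/cat.png?raw=true"
    img_path = download_testdata(img_url, "cat.png", module="data")

    # Resize to the model's input resolution and lay out as NCHW with batch 1.
    img = Image.open(img_path).convert("RGB").resize((224, 224))
    data = np.asarray(img).astype("float32") / 255.0
    data = np.expand_dims(data.transpose(2, 0, 1), axis=0)

    # After inference, the top-1 index is looked up in the 1000-class synset:
    # top1 = np.argmax(output)
    # print(synset[top1])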
diff --git a/docs/_sources/how_to/work_with_microtvm/micro_train.rst.txt b/docs/_sources/how_to/work_with_microtvm/micro_train.rst.txt
index 02a3de0aa7..119e5eaadd 100644
--- a/docs/_sources/how_to/work_with_microtvm/micro_train.rst.txt
+++ b/docs/_sources/how_to/work_with_microtvm/micro_train.rst.txt
@@ -217,7 +217,7 @@ take about **2 minutes** to download the Stanford Cars, while COCO 2017 validati
  .. code-block:: none
 
 
-    '/tmp/tmpmvn13rdm/images/random'
+    '/tmp/tmpvp040j6r/images/random'
 
 
 
@@ -317,8 +317,8 @@ objects to other stuff? We can display some examples from our datasets using ``m
 
  .. code-block:: none
 
-    /tmp/tmpmvn13rdm/images/target contains 8144 images
-    /tmp/tmpmvn13rdm/images/random contains 5000 images
+    /tmp/tmpvp040j6r/images/target contains 8144 images
+    /tmp/tmpvp040j6r/images/random contains 5000 images
 
 
 
@@ -493,13 +493,13 @@ the time on our validation set).
  .. code-block:: none
 
     Epoch 1/3
-    328/328 - 38s - loss: 0.2239 - accuracy: 0.9251 - val_loss: 0.1174 - val_accuracy: 0.9611 - 38s/epoch - 116ms/step
+    328/328 - 38s - loss: 0.2149 - accuracy: 0.9268 - val_loss: 0.1321 - val_accuracy: 0.9528 - 38s/epoch - 117ms/step
     Epoch 2/3
-    328/328 - 34s - loss: 0.1036 - accuracy: 0.9608 - val_loss: 0.1293 - val_accuracy: 0.9468 - 34s/epoch - 105ms/step
+    328/328 - 34s - loss: 0.1025 - accuracy: 0.9633 - val_loss: 0.1075 - val_accuracy: 0.9660 - 34s/epoch - 104ms/step
     Epoch 3/3
-    328/328 - 34s - loss: 0.0704 - accuracy: 0.9746 - val_loss: 0.1212 - val_accuracy: 0.9603 - 34s/epoch - 105ms/step
+    328/328 - 34s - loss: 0.0693 - accuracy: 0.9735 - val_loss: 0.1036 - val_accuracy: 0.9671 - 34s/epoch - 104ms/step
 
-    <keras.callbacks.History object at 0x7f046de62760>
+    <keras.callbacks.History object at 0x7f8f73b4a8e0>
 
 
 
@@ -860,7 +860,7 @@ Arduino tutorial for how to do that `on GitHub <https://github.com/guberti/tvm-a
 
 .. rst-class:: sphx-glr-timing
 
-   **Total running time of the script:** ( 4 minutes  48.012 seconds)
+   **Total running time of the script:** ( 4 minutes  45.515 seconds)
 
 
 .. _sphx_glr_download_how_to_work_with_microtvm_micro_train.py:
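The per-epoch "328/328 - ..." lines above are standard Keras training output. A minimal sketch of the training call behind them, assuming ``model``, ``train_dataset``, and ``validation_dataset`` are the MobileNet-based classifier and image datasets constructed earlier in that tutorial (the exact loss and optimizer are illustrative):

.. code-block:: python

    def train(model, train_dataset, validation_dataset, epochs=3):
        model.compile(
            optimizer="adam",
            loss="categorical_crossentropy",
            metrics=["accuracy"],
        )
        # verbose=2 produces the one-line-per-epoch logs shown above.
        return model.fit(
            train_dataset,
            validation_data=validation_dataset,
            epochs=epochs,
            verbose=2,
        )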
diff --git a/docs/_sources/how_to/work_with_microtvm/sg_execution_times.rst.txt b/docs/_sources/how_to/work_with_microtvm/sg_execution_times.rst.txt
index c0359c1e4f..395155f46f 100644
--- a/docs/_sources/how_to/work_with_microtvm/sg_execution_times.rst.txt
+++ b/docs/_sources/how_to/work_with_microtvm/sg_execution_times.rst.txt
@@ -5,24 +5,24 @@
 
 Computation times
 =================
-**08:09.140** total execution time for **how_to_work_with_microtvm** files:
+**08:00.439** total execution time for **how_to_work_with_microtvm** files:
 
 +-----------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_microtvm_micro_train.py` (``micro_train.py``)           | 04:48.012 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_microtvm_micro_train.py` (``micro_train.py``)           | 04:45.515 | 0.0 MB |
 +-----------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_microtvm_micro_pytorch.py` (``micro_pytorch.py``)       | 01:27.905 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_microtvm_micro_pytorch.py` (``micro_pytorch.py``)       | 01:24.486 | 0.0 MB |
 +-----------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_microtvm_micro_autotune.py` (``micro_autotune.py``)     | 01:24.970 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_microtvm_micro_autotune.py` (``micro_autotune.py``)     | 01:22.628 | 0.0 MB |
 +-----------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_microtvm_micro_aot.py` (``micro_aot.py``)               | 00:11.857 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_microtvm_micro_aot.py` (``micro_aot.py``)               | 00:11.454 | 0.0 MB |
 +-----------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_microtvm_micro_custom_ide.py` (``micro_custom_ide.py``) | 00:08.907 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_microtvm_micro_custom_ide.py` (``micro_custom_ide.py``) | 00:08.370 | 0.0 MB |
 +-----------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_microtvm_micro_tflite.py` (``micro_tflite.py``)         | 00:07.490 | 0.0 MB |
-+-----------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_microtvm_micro_tvmc.py` (``micro_tvmc.py``)             | 00:00.000 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_microtvm_micro_tflite.py` (``micro_tflite.py``)         | 00:07.987 | 0.0 MB |
 +-----------------------------------------------------------------------------------------+-----------+--------+
 | :ref:`sphx_glr_how_to_work_with_microtvm_micro_ethosu.py` (``micro_ethosu.py``)         | 00:00.000 | 0.0 MB |
 +-----------------------------------------------------------------------------------------+-----------+--------+
+| :ref:`sphx_glr_how_to_work_with_microtvm_micro_tvmc.py` (``micro_tvmc.py``)             | 00:00.000 | 0.0 MB |
++-----------------------------------------------------------------------------------------+-----------+--------+
 | :ref:`sphx_glr_how_to_work_with_microtvm_micro_mlperftiny.py` (``micro_mlperftiny.py``) | 00:00.000 | 0.0 MB |
 +-----------------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/how_to/work_with_relay/sg_execution_times.rst.txt b/docs/_sources/how_to/work_with_relay/sg_execution_times.rst.txt
index e6f0a606ca..26d8c6585b 100644
--- a/docs/_sources/how_to/work_with_relay/sg_execution_times.rst.txt
+++ b/docs/_sources/how_to/work_with_relay/sg_execution_times.rst.txt
@@ -5,14 +5,14 @@
 
 Computation times
 =================
-**00:38.098** total execution time for **how_to_work_with_relay** files:
+**00:37.291** total execution time for **how_to_work_with_relay** files:
 
 +----------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_relay_using_pipeline_executor.py` (``using_pipeline_executor.py``) | 00:32.948 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_relay_using_pipeline_executor.py` (``using_pipeline_executor.py``) | 00:32.255 | 0.0 MB |
 +----------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_relay_using_external_lib.py` (``using_external_lib.py``)           | 00:03.254 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_relay_using_external_lib.py` (``using_external_lib.py``)           | 00:03.137 | 0.0 MB |
 +----------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_relay_build_gcn.py` (``build_gcn.py``)                             | 00:01.890 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_relay_build_gcn.py` (``build_gcn.py``)                             | 00:01.892 | 0.0 MB |
 +----------------------------------------------------------------------------------------------------+-----------+--------+
 | :ref:`sphx_glr_how_to_work_with_relay_using_relay_viz.py` (``using_relay_viz.py``)                 | 00:00.007 | 0.0 MB |
 +----------------------------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/how_to/work_with_schedules/intrin_math.rst.txt b/docs/_sources/how_to/work_with_schedules/intrin_math.rst.txt
index dd737fd0e0..70be358a9d 100644
--- a/docs/_sources/how_to/work_with_schedules/intrin_math.rst.txt
+++ b/docs/_sources/how_to/work_with_schedules/intrin_math.rst.txt
@@ -281,7 +281,7 @@ The following example customizes CUDA lowering rule for :code:`exp`.
  .. code-block:: none
 
 
-    <function my_cuda_math_rule at 0x7f064815eee0>
+    <function my_cuda_math_rule at 0x7f914a1abdc0>
 
 
 
diff --git a/docs/_sources/how_to/work_with_schedules/sg_execution_times.rst.txt b/docs/_sources/how_to/work_with_schedules/sg_execution_times.rst.txt
index a5bcc47fb8..e07dd78ea0 100644
--- a/docs/_sources/how_to/work_with_schedules/sg_execution_times.rst.txt
+++ b/docs/_sources/how_to/work_with_schedules/sg_execution_times.rst.txt
@@ -5,22 +5,22 @@
 
 Computation times
 =================
-**00:05.432** total execution time for **how_to_work_with_schedules** files:
+**00:05.466** total execution time for **how_to_work_with_schedules** files:
 
 +------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_schedules_intrin_math.py` (``intrin_math.py``)                 | 00:02.496 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_schedules_intrin_math.py` (``intrin_math.py``)                 | 00:02.525 | 0.0 MB |
 +------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_schedules_tensorize.py` (``tensorize.py``)                     | 00:01.246 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_schedules_tensorize.py` (``tensorize.py``)                     | 00:01.241 | 0.0 MB |
 +------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_schedules_reduction.py` (``reduction.py``)                     | 00:00.709 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_schedules_reduction.py` (``reduction.py``)                     | 00:00.724 | 0.0 MB |
 +------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_schedules_scan.py` (``scan.py``)                               | 00:00.698 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_schedules_scan.py` (``scan.py``)                               | 00:00.700 | 0.0 MB |
 +------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_schedules_extern_op.py` (``extern_op.py``)                     | 00:00.118 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_schedules_extern_op.py` (``extern_op.py``)                     | 00:00.114 | 0.0 MB |
 +------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_schedules_schedule_primitives.py` (``schedule_primitives.py``) | 00:00.069 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_schedules_schedule_primitives.py` (``schedule_primitives.py``) | 00:00.068 | 0.0 MB |
 +------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_schedules_tedd.py` (``tedd.py``)                               | 00:00.066 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_schedules_tedd.py` (``tedd.py``)                               | 00:00.064 | 0.0 MB |
 +------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_how_to_work_with_schedules_tuple_inputs.py` (``tuple_inputs.py``)               | 00:00.031 | 0.0 MB |
+| :ref:`sphx_glr_how_to_work_with_schedules_tuple_inputs.py` (``tuple_inputs.py``)               | 00:00.030 | 0.0 MB |
 +------------------------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/topic/vta/tutorials/frontend/deploy_detection.rst.txt b/docs/_sources/topic/vta/tutorials/frontend/deploy_detection.rst.txt
index 9ee5f74b1b..9c5bd59e6e 100644
--- a/docs/_sources/topic/vta/tutorials/frontend/deploy_detection.rst.txt
+++ b/docs/_sources/topic/vta/tutorials/frontend/deploy_detection.rst.txt
@@ -337,7 +337,7 @@ The compilation steps are:
 
     /workspace/python/tvm/relay/build_module.py:345: DeprecationWarning: Please use input parameter mod (tvm.IRModule) instead of deprecated parameter mod (tvm.relay.function.Function)
       warnings.warn(
-    yolov3-tiny inference graph built in 25.97s!
+    yolov3-tiny inference graph built in 24.99s!
 
 
 
diff --git a/docs/_sources/topic/vta/tutorials/frontend/sg_execution_times.rst.txt b/docs/_sources/topic/vta/tutorials/frontend/sg_execution_times.rst.txt
index 84e4f4da53..81aa9a56ad 100644
--- a/docs/_sources/topic/vta/tutorials/frontend/sg_execution_times.rst.txt
+++ b/docs/_sources/topic/vta/tutorials/frontend/sg_execution_times.rst.txt
@@ -5,8 +5,8 @@
 
 Computation times
 =================
-**00:58.720** total execution time for **topic_vta_tutorials_frontend** files:
+**00:57.499** total execution time for **topic_vta_tutorials_frontend** files:
 
 +--------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_topic_vta_tutorials_frontend_deploy_detection.py` (``deploy_detection.py``) | 00:58.720 | 0.0 MB |
+| :ref:`sphx_glr_topic_vta_tutorials_frontend_deploy_detection.py` (``deploy_detection.py``) | 00:57.499 | 0.0 MB |
 +--------------------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/topic/vta/tutorials/optimize/sg_execution_times.rst.txt b/docs/_sources/topic/vta/tutorials/optimize/sg_execution_times.rst.txt
index e54c0b71c2..d9463cebc7 100644
--- a/docs/_sources/topic/vta/tutorials/optimize/sg_execution_times.rst.txt
+++ b/docs/_sources/topic/vta/tutorials/optimize/sg_execution_times.rst.txt
@@ -5,10 +5,10 @@
 
 Computation times
 =================
-**00:03.215** total execution time for **topic_vta_tutorials_optimize** files:
+**00:03.224** total execution time for **topic_vta_tutorials_optimize** files:
 
 +--------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_topic_vta_tutorials_optimize_convolution_opt.py` (``convolution_opt.py``)         | 00:02.722 | 0.0 MB |
+| :ref:`sphx_glr_topic_vta_tutorials_optimize_convolution_opt.py` (``convolution_opt.py``)         | 00:02.734 | 0.0 MB |
 +--------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_topic_vta_tutorials_optimize_matrix_multiply_opt.py` (``matrix_multiply_opt.py``) | 00:00.493 | 0.0 MB |
+| :ref:`sphx_glr_topic_vta_tutorials_optimize_matrix_multiply_opt.py` (``matrix_multiply_opt.py``) | 00:00.490 | 0.0 MB |
 +--------------------------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/topic/vta/tutorials/sg_execution_times.rst.txt b/docs/_sources/topic/vta/tutorials/sg_execution_times.rst.txt
index 48739f5ccf..b6b02a15d7 100644
--- a/docs/_sources/topic/vta/tutorials/sg_execution_times.rst.txt
+++ b/docs/_sources/topic/vta/tutorials/sg_execution_times.rst.txt
@@ -5,10 +5,10 @@
 
 Computation times
 =================
-**00:00.814** total execution time for **topic_vta_tutorials** files:
+**00:00.809** total execution time for **topic_vta_tutorials** files:
 
 +---------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_topic_vta_tutorials_matrix_multiply.py` (``matrix_multiply.py``) | 00:00.421 | 0.0 MB |
+| :ref:`sphx_glr_topic_vta_tutorials_matrix_multiply.py` (``matrix_multiply.py``) | 00:00.419 | 0.0 MB |
 +---------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_topic_vta_tutorials_vta_get_started.py` (``vta_get_started.py``) | 00:00.393 | 0.0 MB |
+| :ref:`sphx_glr_topic_vta_tutorials_vta_get_started.py` (``vta_get_started.py``) | 00:00.391 | 0.0 MB |
 +---------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/tutorial/auto_scheduler_matmul_x86.rst.txt b/docs/_sources/tutorial/auto_scheduler_matmul_x86.rst.txt
index 685c7d97e1..27dc094beb 100644
--- a/docs/_sources/tutorial/auto_scheduler_matmul_x86.rst.txt
+++ b/docs/_sources/tutorial/auto_scheduler_matmul_x86.rst.txt
@@ -207,13 +207,6 @@ trials, we can load the best schedule from the log file and apply it.
 
 
 
-.. rst-class:: sphx-glr-script-out
-
- .. code-block:: none
-
-
-    *E
-
 
 
 
@@ -325,7 +318,7 @@ We build the binary and check its correctness and performance.
 
  .. code-block:: none
 
-    Execution time of this operator: 94.584 ms
+    Execution time of this operator: 91.374 ms
 
 
 
@@ -441,7 +434,7 @@ operations.
 
 .. rst-class:: sphx-glr-timing
 
-   **Total running time of the script:** ( 1 minutes  31.186 seconds)
+   **Total running time of the script:** ( 1 minutes  30.520 seconds)
 
 
 .. _sphx_glr_download_tutorial_auto_scheduler_matmul_x86.py:
diff --git a/docs/_sources/tutorial/autotvm_matmul_x86.rst.txt b/docs/_sources/tutorial/autotvm_matmul_x86.rst.txt
index 8c4ba42d5e..d5d0dfe989 100644
--- a/docs/_sources/tutorial/autotvm_matmul_x86.rst.txt
+++ b/docs/_sources/tutorial/autotvm_matmul_x86.rst.txt
@@ -454,16 +454,16 @@ reduce variance, we take 5 measurements and average them.
     waiting for device...
     device available
     Get devices for measurement successfully!
-    No: 1   GFLOPS: 1.51/1.51       result: MeasureResult(costs=(0.17835317460000003,), error_no=MeasureErrorNo.NO_ERROR, all_cost=3.10009503364563, timestamp=1708112752.993949)   [('tile_y', [-1, 1]), ('tile_x', [-1, 1])],None,0
-    No: 2   GFLOPS: 11.34/11.34     result: MeasureResult(costs=(0.0236654322,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6512999534606934, timestamp=1708112753.635269)        [('tile_y', [-1, 128]), ('tile_x', [-1, 32])],None,57
-    No: 3   GFLOPS: 3.86/11.34      result: MeasureResult(costs=(0.0694950984,), error_no=MeasureErrorNo.NO_ERROR, all_cost=1.3890049457550049, timestamp=1708112755.0241795)       [('tile_y', [-1, 4]), ('tile_x', [-1, 2])],None,12
-    No: 4   GFLOPS: 12.59/12.59     result: MeasureResult(costs=(0.021325845599999997,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6152889728546143, timestamp=1708112755.6300626)       [('tile_y', [-1, 32]), ('tile_x', [-1, 128])],None,75
-    No: 5   GFLOPS: 10.53/12.59     result: MeasureResult(costs=(0.0255028956,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.724308967590332, timestamp=1708112756.465089) [('tile_y', [-1, 1]), ('tile_x', [-1, 64])],None,60
-    No: 6   GFLOPS: 3.65/12.59      result: MeasureResult(costs=(0.0735273582,), error_no=MeasureErrorNo.NO_ERROR, all_cost=1.4519906044006348, timestamp=1708112757.901221)        [('tile_y', [-1, 64]), ('tile_x', [-1, 8])],None,36
-    No: 7   GFLOPS: 7.74/12.59      result: MeasureResult(costs=(0.0346974656,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.8124942779541016, timestamp=1708112758.7230752)       [('tile_y', [-1, 512]), ('tile_x', [-1, 32])],None,59
-    No: 8   GFLOPS: 5.78/12.59      result: MeasureResult(costs=(0.046433825,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.9909617900848389, timestamp=1708112759.725885) [('tile_y', [-1, 1]), ('tile_x', [-1, 4])],None,20
-    No: 9   GFLOPS: 11.01/12.59     result: MeasureResult(costs=(0.0243856706,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6288285255432129, timestamp=1708112760.4636207)       [('tile_y', [-1, 64]), ('tile_x', [-1, 256])],None,86
-    No: 10  GFLOPS: 2.02/12.59      result: MeasureResult(costs=(0.133171552,), error_no=MeasureErrorNo.NO_ERROR, all_cost=2.3537261486053467, timestamp=1708112762.8518498)        [('tile_y', [-1, 64]), ('tile_x', [-1, 4])],None,26
+    No: 1   GFLOPS: 9.13/9.13       result: MeasureResult(costs=(0.0293897064,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.71429443359375, timestamp=1708122324.5400858) [('tile_y', [-1, 1]), ('tile_x', [-1, 32])],None,50
+    No: 2   GFLOPS: 11.23/11.23     result: MeasureResult(costs=(0.0239100742,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.7238938808441162, timestamp=1708122325.1927767)       [('tile_y', [-1, 1]), ('tile_x', [-1, 128])],None,70
+    No: 3   GFLOPS: 13.73/13.73     result: MeasureResult(costs=(0.0195579744,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6750574111938477, timestamp=1708122325.7714353)       [('tile_y', [-1, 256]), ('tile_x', [-1, 128])],None,78
+    No: 4   GFLOPS: 13.77/13.77     result: MeasureResult(costs=(0.0195008208,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.5940024852752686, timestamp=1708122326.3449867)       [('tile_y', [-1, 8]), ('tile_x', [-1, 256])],None,83
+    No: 5   GFLOPS: 10.60/13.77     result: MeasureResult(costs=(0.025329714000000003,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6626152992248535, timestamp=1708122327.1872604)       [('tile_y', [-1, 4]), ('tile_x', [-1, 32])],None,52
+    No: 6   GFLOPS: 12.22/13.77     result: MeasureResult(costs=(0.0219690712,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6191966533660889, timestamp=1708122327.7918754)       [('tile_y', [-1, 16]), ('tile_x', [-1, 16])],None,44
+    No: 7   GFLOPS: 9.28/13.77      result: MeasureResult(costs=(0.0289150848,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.724522590637207, timestamp=1708122328.505882) [('tile_y', [-1, 256]), ('tile_x', [-1, 16])],None,48
+    No: 8   GFLOPS: 9.59/13.77      result: MeasureResult(costs=(0.028001969200000004,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.7735788822174072, timestamp=1708122329.2077875)       [('tile_y', [-1, 512]), ('tile_x', [-1, 64])],None,69
+    No: 9   GFLOPS: 13.80/13.80     result: MeasureResult(costs=(0.0194582234,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.5858185291290283, timestamp=1708122329.90334) [('tile_y', [-1, 128]), ('tile_x', [-1, 128])],None,77
+    No: 10  GFLOPS: 9.96/13.80      result: MeasureResult(costs=(0.026938249000000004,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6510946750640869, timestamp=1708122330.585345)        [('tile_y', [-1, 4]), ('tile_x', [-1, 16])],None,42
 
 
 
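The "No: N  GFLOPS: ..." records above come from AutoTVM's tuning loop, where each candidate is measured several times and averaged to reduce variance. A minimal sketch of that setup, assuming a matmul template registered under the name ``"tutorial/matmul"`` as in the tutorial:

.. code-block:: python

    from tvm import autotvm

    def tune_matmul(N=512, L=512, M=512, trials=10, log_file="matmul.log"):
        task = autotvm.task.create(
            "tutorial/matmul", args=(N, L, M, "float32"), target="llvm"
        )
        measure_option = autotvm.measure_option(
            builder="local",
            runner=autotvm.LocalRunner(number=5),  # 5 measurements, averaged
        )
        tuner = autotvm.tuner.RandomTuner(task)
        tuner.tune(
            n_trial=trials,
            measure_option=measure_option,
            callbacks=[autotvm.callback.log_to_file(log_file)],
        )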
diff --git a/docs/_sources/tutorial/autotvm_relay_x86.rst.txt b/docs/_sources/tutorial/autotvm_relay_x86.rst.txt
index af6b8c5ecf..cc6e16e804 100644
--- a/docs/_sources/tutorial/autotvm_relay_x86.rst.txt
+++ b/docs/_sources/tutorial/autotvm_relay_x86.rst.txt
@@ -311,7 +311,7 @@ standard deviation.
 
  .. code-block:: none
 
-    {'mean': 492.71140236000065, 'median': 492.71483809998244, 'std': 3.6515681597064877}
+    {'mean': 466.5144750900072, 'median': 466.5101404500092, 'std': 1.2822160389899164}
 
 
 
@@ -582,30 +582,31 @@ the tuning data to.
 
  .. code-block:: none
 
-     [Task  1/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  1/25]  Current/Best:   10.92/  23.17 GFLOPS | Progress: (4/20) | 9.02 s     [Task  1/25]  Current/Best:    9.70/  23.17 GFLOPS | Progress: (8/20) | 11.45 s     [Task  1/25]  Current/Best:   12.83/  23.17 GFLOPS | Progress: (12/20) | 14.44 s     [Task  1/25]  Current/Best:    3.33/  23.58 GFLOPS | Progress: (16/20) | 17.31 s     [Task  1/25]  Current/Best:   10.67/  23.58 GFLOPS | Progress: (20/20) | 22.19 s Done.
-     [Task  2/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  2/25]  Current/Best:   15.66/  17.36 GFLOPS | Progress: (4/20) | 4.21 s     [Task  2/25]  Current/Best:    7.19/  17.36 GFLOPS | Progress: (8/20) | 5.92 s     [Task  2/25]  Current/Best:    3.39/  17.36 GFLOPS | Progress: (12/20) | 7.81 s     [Task  2/25]  Current/Best:    7.31/  17.36 GFLOPS | Progress: (16/20) | 9.32 s     [Task  2/25]  Current/Best:   17.71/  17.71 GFLOPS | Progress: (20/20) | 10.90 s Done.
-     [Task  3/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  3/25]  Current/Best:    1.62/  19.44 GFLOPS | Progress: (4/20) | 7.61 s     [Task  3/25]  Current/Best:   19.37/  24.26 GFLOPS | Progress: (8/20) | 10.96 s     [Task  3/25]  Current/Best:   17.35/  24.26 GFLOPS | Progress: (12/20) | 13.65 s     [Task  3/25]  Current/Best:    6.41/  24.26 GFLOPS | Progress: (16/20) | 16.03 s     [Task  3/25]  Current/Best:   11.35/  24.26 GFLOPS | Progress: (20/20) | 18.99 s Done.
-     [Task  4/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  4/25]  Current/Best:   15.56/  19.73 GFLOPS | Progress: (4/20) | 4.56 s     [Task  4/25]  Current/Best:   14.50/  19.73 GFLOPS | Progress: (8/20) | 7.26 s     [Task  4/25]  Current/Best:   10.23/  19.73 GFLOPS | Progress: (12/20) | 9.42 s     [Task  4/25]  Current/Best:   14.39/  19.73 GFLOPS | Progress: (16/20) | 13.60 s     [Task  4/25]  Current/Best:   13.08/  19.73 GFLOPS | Progress: (20/20) | 17.03 s Done.
-     [Task  5/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  5/25]  Current/Best:    8.00/  14.23 GFLOPS | Progress: (4/20) | 4.58 s     [Task  5/25]  Current/Best:    1.50/  15.63 GFLOPS | Progress: (8/20) | 7.15 s     [Task  5/25]  Current/Best:   17.90/  17.90 GFLOPS | Progress: (12/20) | 9.19 s     [Task  5/25]  Current/Best:    3.20/  21.82 GFLOPS | Progress: (16/20) | 12.40 s     [Task  5/25]  Current/Best:   13.78/  21.82 GFLOPS | Progress: (20/20) | 14.91 s Done.
-     [Task  6/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  6/25]  Current/Best:   11.34/  13.74 GFLOPS | Progress: (4/20) | 5.42 s     [Task  6/25]  Current/Best:   22.03/  22.03 GFLOPS | Progress: (8/20) | 7.62 s     [Task  6/25]  Current/Best:   15.98/  22.03 GFLOPS | Progress: (12/20) | 9.84 s     [Task  6/25]  Current/Best:    2.42/  22.03 GFLOPS | Progress: (16/20) | 12.74 s     [Task  6/25]  Current/Best:   14.97/  22.03 GFLOPS | Progress: (20/20) | 16.15 s Done.
-     [Task  7/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  7/25]  Current/Best:    3.14/  21.01 GFLOPS | Progress: (4/20) | 5.61 s     [Task  7/25]  Current/Best:   14.91/  21.01 GFLOPS | Progress: (8/20) | 9.35 s     [Task  7/25]  Current/Best:   12.45/  21.01 GFLOPS | Progress: (12/20) | 12.35 s     [Task  7/25]  Current/Best:   20.32/  21.01 GFLOPS | Progress: (16/20) | 15.01 s     [Task  7/25]  Current/Best:   11.14/  21.01 GFLOPS | Progress: (20/20) | 17.31 s Done.
-     [Task  8/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  8/25]  Current/Best:   14.71/  15.26 GFLOPS | Progress: (4/20) | 4.99 s     [Task  8/25]  Current/Best:    2.89/  15.26 GFLOPS | Progress: (8/20) | 8.13 s     [Task  8/25]  Current/Best:   12.13/  17.63 GFLOPS | Progress: (12/20) | 11.48 s     [Task  8/25]  Current/Best:    9.61/  17.63 GFLOPS | Progress: (16/20) | 17.56 s     [Task  8/25]  Current/Best:   10.39/  17.63 GFLOPS | Progress: (20/20) | 22.98 s Done.
-     [Task  9/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  9/25]  Current/Best:   11.39/  16.74 GFLOPS | Progress: (4/20) | 4.93 s     [Task  9/25]  Current/Best:    9.20/  16.74 GFLOPS | Progress: (8/20) | 8.18 s     [Task  9/25]  Current/Best:   12.51/  16.74 GFLOPS | Progress: (12/20) | 15.73 s     [Task  9/25]  Current/Best:   18.13/  19.73 GFLOPS | Progress: (16/20) | 18.36 s     [Task  9/25]  Current/Best:    8.51/  19.73 GFLOPS | Progress: (20/20) | 26.27 s Done.
-     [Task 10/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 10/25]  Current/Best:   13.88/  14.79 GFLOPS | Progress: (4/20) | 4.77 s     [Task 10/25]  Current/Best:   10.11/  14.79 GFLOPS | Progress: (8/20) | 7.36 s     [Task 10/25]  Current/Best:    3.97/  17.30 GFLOPS | Progress: (12/20) | 9.42 s     [Task 10/25]  Current/Best:   18.43/  18.43 GFLOPS | Progress: (16/20) | 11.40 s     [Task 10/25]  Current/Best:   18.30/  18.43 GFLOPS | Progress: (20/20) | 14.24 s Done.
-     [Task 11/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 11/25]  Current/Best:   18.33/  18.33 GFLOPS | Progress: (4/20) | 5.54 s     [Task 11/25]  Current/Best:   20.07/  20.07 GFLOPS | Progress: (8/20) | 7.98 s     [Task 11/25]  Current/Best:   21.14/  21.14 GFLOPS | Progress: (12/20) | 10.34 s     [Task 11/25]  Current/Best:   19.04/  21.14 GFLOPS | Progress: (16/20) | 12.96 s     [Task 11/25]  Current/Best:    6.62/  21.14 GFLOPS | Progress: (20/20) | 15.31 s Done.
-     [Task 12/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 12/25]  Current/Best:   15.74/  16.61 GFLOPS | Progress: (4/20) | 4.99 s     [Task 12/25]  Current/Best:    8.15/  16.61 GFLOPS | Progress: (8/20) | 9.85 s     [Task 12/25]  Current/Best:   12.66/  16.61 GFLOPS | Progress: (12/20) | 12.90 s     [Task 12/25]  Current/Best:   18.72/  21.36 GFLOPS | Progress: (16/20) | 16.98 s     [Task 12/25]  Current/Best:   18.70/  21.36 GFLOPS | Progress: (20/20) | 18.93 s Done.
-     [Task 13/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 13/25]  Current/Best:   12.08/  21.32 GFLOPS | Progress: (4/20) | 4.91 s     [Task 13/25]  Current/Best:   11.79/  21.32 GFLOPS | Progress: (8/20) | 8.05 s     [Task 13/25]  Current/Best:    7.07/  21.32 GFLOPS | Progress: (12/20) | 10.37 s     [Task 13/25]  Current/Best:   19.25/  21.32 GFLOPS | Progress: (16/20) | 13.13 s     [Task 13/25]  Current/Best:   18.85/  21.32 GFLOPS | Progress: (20/20) | 18.32 s Done.
-     [Task 14/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 14/25]  Current/Best:   12.44/  12.44 GFLOPS | Progress: (4/20) | 12.70 s     [Task 14/25]  Current/Best:    1.61/  16.43 GFLOPS | Progress: (8/20) | 20.28 s     [Task 14/25]  Current/Best:   16.91/  18.81 GFLOPS | Progress: (12/20) | 22.11 s     [Task 14/25]  Current/Best:    9.50/  18.81 GFLOPS | Progress: (16/20) | 25.52 s     [Task 14/25]  Current/Best:   19.60/  19.60 GFLOPS | Progress: (20/20) | 33.13 s Done.
-     [Task 15/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 15/25]  Current/Best:    7.86/  14.38 GFLOPS | Progress: (4/20) | 7.21 s     [Task 15/25]  Current/Best:   16.73/  21.04 GFLOPS | Progress: (8/20) | 9.70 s     [Task 15/25]  Current/Best:   13.61/  21.04 GFLOPS | Progress: (12/20) | 11.57 s     [Task 15/25]  Current/Best:   22.30/  22.30 GFLOPS | Progress: (16/20) | 13.06 s     [Task 15/25]  Current/Best:   22.70/  23.29 GFLOPS | Progress: (20/20) | 14.80 s Done.
-     [Task 16/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 16/25]  Current/Best:   15.03/  18.73 GFLOPS | Progress: (4/20) | 5.32 s     [Task 16/25]  Current/Best:   13.19/  18.73 GFLOPS | Progress: (8/20) | 7.39 s     [Task 16/25]  Current/Best:    9.65/  21.93 GFLOPS | Progress: (12/20) | 9.31 s     [Task 16/25]  Current/Best:   15.00/  21.93 GFLOPS | Progress: (16/20) | 11.10 s     [Task 16/25]  Current/Best:   13.77/  21.93 GFLOPS | Progress: (20/20) | 14.31 s Done.
-     [Task 17/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 17/25]  Current/Best:   11.97/  21.77 GFLOPS | Progress: (4/20) | 5.94 s     [Task 17/25]  Current/Best:   21.08/  21.77 GFLOPS | Progress: (8/20) | 8.59 s     [Task 17/25]  Current/Best:   19.87/  21.77 GFLOPS | Progress: (12/20) | 11.84 s     [Task 17/25]  Current/Best:   19.02/  21.77 GFLOPS | Progress: (16/20) | 14.05 s     [Task 17/25]  Current/Best:   17.72/  21.77 GFLOPS | Progress: (20/20) | 18.12 s Done.
-     [Task 18/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 18/25]  Current/Best:    7.14/   9.34 GFLOPS | Progress: (4/20) | 6.83 s     [Task 18/25]  Current/Best:    7.62/  17.35 GFLOPS | Progress: (8/20) | 14.49 s     [Task 18/25]  Current/Best:   10.98/  20.59 GFLOPS | Progress: (12/20) | 16.68 s     [Task 18/25]  Current/Best:    9.43/  20.59 GFLOPS | Progress: (16/20) | 21.59 s     [Task 18/25]  Current/Best:   14.67/  20.59 GFLOPS | Progress: (20/20) | 23.63 s Done.
-     [Task 19/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 19/25]  Current/Best:    5.34/  18.71 GFLOPS | Progress: (4/20) | 6.84 s     [Task 19/25]  Current/Best:    8.29/  20.47 GFLOPS | Progress: (8/20) | 10.01 s     [Task 19/25]  Current/Best:   10.68/  20.47 GFLOPS | Progress: (12/20) | 13.18 s     [Task 19/25]  Current/Best:   12.87/  21.12 GFLOPS | Progress: (16/20) | 15.89 s     [Task 19/25]  Current/Best:    6.76/  21.12 GFLOPS | Progress: (20/20) | 20.55 s Done.
-     [Task 20/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 20/25]  Current/Best:   11.68/  18.78 GFLOPS | Progress: (4/20) | 9.29 s     [Task 20/25]  Current/Best:    5.27/  18.78 GFLOPS | Progress: (8/20) | 20.97 s     [Task 20/25]  Current/Best:   11.12/  18.78 GFLOPS | Progress: (12/20) | 33.89 s     [Task 20/25]  Current/Best:   14.92/  18.78 GFLOPS | Progress: (16/20) | 45.26 s     [Task 20/25]  Current/Best:    7.93/  18.78 GFLOPS | Progress: (20/20) | 58.12 s     [Task 21/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s Done.
-     [Task 21/25]  Current/Best:   10.60/  10.60 GFLOPS | Progress: (4/20) | 15.16 s     [Task 21/25]  Current/Best:   16.31/  18.06 GFLOPS | Progress: (8/20) | 26.22 s     [Task 21/25]  Current/Best:    4.82/  18.06 GFLOPS | Progress: (12/20) | 31.63 s     [Task 21/25]  Current/Best:   19.27/  19.27 GFLOPS | Progress: (16/20) | 34.43 s     [Task 21/25]  Current/Best:    9.97/  19.27 GFLOPS | Progress: (20/20) | 45.72 s     [Task 22/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 22/25]  Current/Best:   12.58/  20.63 GFLOPS | Progress: (4/20) | 5.54 s     [Task 22/25]  Current/Best:   21.03/  21.90 GFLOPS | Progress: (8/20) | 7.12 s     [Task 22/25]  Current/Best:    5.01/  21.90 GFLOPS | Progress: (12/20) | 13.98 s     [Task 22/25]  Current/Best:   13.10/  21.90 GFLOPS | Progress: (16/20) | 15.85 s     [Task 22/25]  Current/Best:   11.89/  21.90 GFLOPS | Progress: (20/20) | 17.78 s Done.
-     [Task 23/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 23/25]  Current/Best:    1.55/  20.74 GFLOPS | Progress: (4/20) | 7.46 s     [Task 23/25]  Current/Best:   22.73/  22.73 GFLOPS | Progress: (8/20) | 10.46 s     [Task 23/25]  Current/Best:   12.07/  22.73 GFLOPS | Progress: (12/20) | 13.11 s     [Task 23/25]  Current/Best:   16.75/  22.73 GFLOPS | Progress: (16/20) | 16.55 s     [Task 23/25]  Current/Best:   12.34/  22.73 GFLOPS | Progress: (20/20) | 20.19 s Done.
-     [Task 24/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 24/25]  Current/Best:    8.85/   8.85 GFLOPS | Progress: (4/20) | 7.53 s     [Task 24/25]  Current/Best:    2.10/   8.85 GFLOPS | Progress: (8/20) | 18.56 s     [Task 24/25]  Current/Best:    0.95/   8.85 GFLOPS | Progress: (12/20) | 29.62 s     [Task 24/25]  Current/Best:    9.59/   9.59 GFLOPS | Progress: (16/20) | 42.01 s Done.
-     [Task 24/25]  Current/Best:    3.98/   9.59 GFLOPS | Progress: (20/20) | 53.01 s     [Task 25/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 25/25]  Current/Best:    7.72/   7.72 GFLOPS | Progress: (4/20) | 7.65 s     [Task 25/25]  Current/Best:    1.54/   9.24 GFLOPS | Progress: (8/20) | 10.60 s     [Task 25/25]  Current/Best:    2.99/   9.24 GFLOPS | Progress: (12/20) | 15.61 s     [Task 25/25]  Current/Best:    5.55/   9.24 GFLOPS | Progress: (16/20) | 17.82 s     [Task 25/25]  Current/Best:    1.52/   9.24 GFLOPS | Progress: (20/20) | 28.51 s
+     [Task  1/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  1/25]  Current/Best:   13.35/  16.35 GFLOPS | Progress: (4/20) | 9.05 s     [Task  1/25]  Current/Best:   20.95/  20.95 GFLOPS | Progress: (8/20) | 11.39 s     [Task  1/25]  Current/Best:   23.80/  23.80 GFLOPS | Progress: (12/20) | 13.37 s     [Task  1/25]  Current/Best:   18.53/  24.41 GFLOPS | Progress: (16/20) | 16.14 s     [Task  1/25]  Current/Best:   15.94/  24.41 GFLOPS | Progress: (20/20) | 18.52 s Done.
+     [Task  2/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  2/25]  Current/Best:   18.96/  18.96 GFLOPS | Progress: (4/20) | 3.85 s     [Task  2/25]  Current/Best:    9.18/  22.58 GFLOPS | Progress: (8/20) | 5.38 s     [Task  2/25]  Current/Best:    6.46/  22.58 GFLOPS | Progress: (12/20) | 7.03 s     [Task  2/25]  Current/Best:    3.51/  22.58 GFLOPS | Progress: (16/20) | 8.59 s     [Task  2/25]  Current/Best:    7.97/  22.58 GFLOPS | Progress: (20/20) | 10.27 s Done.
+     [Task  3/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  3/25]  Current/Best:    6.44/  20.95 GFLOPS | Progress: (4/20) | 4.64 s     [Task  3/25]  Current/Best:    6.42/  20.95 GFLOPS | Progress: (8/20) | 7.39 s     [Task  3/25]  Current/Best:    8.39/  20.95 GFLOPS | Progress: (12/20) | 9.95 s     [Task  3/25]  Current/Best:    6.65/  20.95 GFLOPS | Progress: (16/20) | 12.31 s     [Task  3/25]  Current/Best:    8.44/  20.95 GFLOPS | Progress: (20/20) | 14.70 s Done.
+     [Task  4/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  4/25]  Current/Best:   11.00/  20.17 GFLOPS | Progress: (4/20) | 5.23 s     [Task  4/25]  Current/Best:    7.93/  20.17 GFLOPS | Progress: (8/20) | 8.36 s     [Task  4/25]  Current/Best:    9.49/  20.83 GFLOPS | Progress: (12/20) | 13.38 s     [Task  4/25]  Current/Best:    8.70/  20.83 GFLOPS | Progress: (16/20) | 15.21 s     [Task  4/25]  Current/Best:   13.11/  21.66 GFLOPS | Progress: (20/20) | 22.72 s Done.
+     [Task  5/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  5/25]  Current/Best:    4.49/  18.59 GFLOPS | Progress: (4/20) | 4.66 s     [Task  5/25]  Current/Best:   11.28/  18.59 GFLOPS | Progress: (8/20) | 6.77 s     [Task  5/25]  Current/Best:   13.84/  19.55 GFLOPS | Progress: (12/20) | 8.40 s     [Task  5/25]  Current/Best:   12.94/  19.55 GFLOPS | Progress: (16/20) | 11.22 s     [Task  5/25]  Current/Best:    6.48/  19.96 GFLOPS | Progress: (20/20) | 13.43 s Done.
+     [Task  6/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  6/25]  Current/Best:   11.12/  15.81 GFLOPS | Progress: (4/20) | 7.41 s     [Task  6/25]  Current/Best:   16.56/  21.18 GFLOPS | Progress: (8/20) | 10.53 s     [Task  6/25]  Current/Best:   12.31/  22.07 GFLOPS | Progress: (12/20) | 14.17 s     [Task  6/25]  Current/Best:   17.64/  22.07 GFLOPS | Progress: (16/20) | 16.15 s     [Task  6/25]  Current/Best:   23.17/  23.17 GFLOPS | Progress: (20/20) | 18.67 s Done.
+     [Task  7/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  7/25]  Current/Best:   13.84/  16.97 GFLOPS | Progress: (4/20) | 4.61 s     [Task  7/25]  Current/Best:   12.39/  19.19 GFLOPS | Progress: (8/20) | 8.16 s     [Task  7/25]  Current/Best:   13.44/  21.82 GFLOPS | Progress: (12/20) | 10.89 s     [Task  7/25]  Current/Best:   18.91/  21.82 GFLOPS | Progress: (16/20) | 13.07 s     [Task  7/25]  Current/Best:   10.72/  21.82 GFLOPS | Progress: (20/20) | 17.41 s Done.
+     [Task  8/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  8/25]  Current/Best:    3.63/  18.60 GFLOPS | Progress: (4/20) | 5.00 s     [Task  8/25]  Current/Best:   11.13/  18.60 GFLOPS | Progress: (8/20) | 8.61 s     [Task  8/25]  Current/Best:   12.98/  18.60 GFLOPS | Progress: (12/20) | 11.21 s     [Task  8/25]  Current/Best:   15.96/  18.60 GFLOPS | Progress: (16/20) | 13.40 s     [Task  8/25]  Current/Best:    3.05/  18.60 GFLOPS | Progress: (20/20) | 16.55 s Done.
+     [Task  9/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task  9/25]  Current/Best:   17.15/  18.14 GFLOPS | Progress: (4/20) | 10.28 s     [Task  9/25]  Current/Best:   15.92/  18.45 GFLOPS | Progress: (8/20) | 12.38 s     [Task  9/25]  Current/Best:    9.44/  18.45 GFLOPS | Progress: (12/20) | 17.74 s     [Task  9/25]  Current/Best:   17.44/  18.45 GFLOPS | Progress: (16/20) | 20.16 s     [Task  9/25]  Current/Best:   22.32/  22.32 GFLOPS | Progress: (20/20) | 22.49 s Done.
+     [Task 10/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 10/25]  Current/Best:    9.02/  12.22 GFLOPS | Progress: (4/20) | 4.51 s     [Task 10/25]  Current/Best:   18.76/  18.76 GFLOPS | Progress: (8/20) | 6.25 s     [Task 10/25]  Current/Best:    6.26/  20.16 GFLOPS | Progress: (12/20) | 7.87 s     [Task 10/25]  Current/Best:    3.17/  20.16 GFLOPS | Progress: (16/20) | 10.09 s     [Task 10/25]  Current/Best:   10.37/  21.58 GFLOPS | Progress: (20/20) | 12.15 s Done.
+     [Task 11/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 11/25]  Current/Best:    9.78/  18.87 GFLOPS | Progress: (4/20) | 4.66 s     [Task 11/25]  Current/Best:   21.41/  21.41 GFLOPS | Progress: (8/20) | 6.89 s     [Task 11/25]  Current/Best:   21.94/  21.94 GFLOPS | Progress: (12/20) | 9.00 s     [Task 11/25]  Current/Best:   14.37/  21.94 GFLOPS | Progress: (16/20) | 11.39 s     [Task 11/25]  Current/Best:    6.46/  21.94 GFLOPS | Progress: (20/20) | 14.40 s Done.
+     [Task 12/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 12/25]  Current/Best:   14.53/  15.63 GFLOPS | Progress: (4/20) | 4.86 s     [Task 12/25]  Current/Best:   14.80/  16.46 GFLOPS | Progress: (8/20) | 7.05 s     [Task 12/25]  Current/Best:   12.93/  16.46 GFLOPS | Progress: (12/20) | 9.48 s     [Task 12/25]  Current/Best:   14.84/  21.84 GFLOPS | Progress: (16/20) | 12.08 s     [Task 12/25]  Current/Best:   11.69/  21.84 GFLOPS | Progress: (20/20) | 14.91 s Done.
+     [Task 13/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 13/25]  Current/Best:   13.06/  15.95 GFLOPS | Progress: (4/20) | 6.04 s     [Task 13/25]  Current/Best:   20.37/  21.57 GFLOPS | Progress: (8/20) | 8.27 s     [Task 13/25]  Current/Best:   17.56/  21.57 GFLOPS | Progress: (12/20) | 12.57 s     [Task 13/25]  Current/Best:    9.53/  22.89 GFLOPS | Progress: (16/20) | 16.01 s     [Task 13/25]  Current/Best:    6.17/  22.89 GFLOPS | Progress: (20/20) | 19.18 s Done.
+     [Task 14/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 14/25]  Current/Best:   13.30/  19.36 GFLOPS | Progress: (4/20) | 6.52 s     [Task 14/25]  Current/Best:    2.43/  20.73 GFLOPS | Progress: (8/20) | 13.17 s     [Task 14/25]  Current/Best:    3.11/  20.73 GFLOPS | Progress: (12/20) | 16.75 s     [Task 14/25]  Current/Best:   15.91/  20.73 GFLOPS | Progress: (16/20) | 23.37 s     [Task 14/25]  Current/Best:    7.95/  21.68 GFLOPS | Progress: (20/20) | 28.15 s Done.
+     [Task 15/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 15/25]  Current/Best:    7.34/  23.18 GFLOPS | Progress: (4/20) | 7.79 s     [Task 15/25]  Current/Best:   21.76/  23.18 GFLOPS | Progress: (8/20) | 12.19 s     [Task 15/25]  Current/Best:   15.81/  23.18 GFLOPS | Progress: (12/20) | 23.20 s     [Task 15/25]  Current/Best:    6.34/  23.18 GFLOPS | Progress: (16/20) | 31.40 s     [Task 15/25]  Current/Best:    6.25/  23.18 GFLOPS | Progress: (20/20) | 33.30 s     [Task 16/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 16/25]  Current/Best:   11.21/  19.50 GFLOPS | Progress: (4/20) | 5.14 s     [Task 16/25]  Current/Best:    6.75/  19.50 GFLOPS | Progress: (8/20) | 7.81 s     [Task 16/25]  Current/Best:   19.29/  19.50 GFLOPS | Progress: (12/20) | 9.79 s     [Task 16/25]  Current/Best:   20.74/  20.74 GFLOPS | Progress: (16/20) | 11.46 s     [Task 16/25]  Current/Best:    7.78/  21.01 GFLOPS | Progress: (20/20) | 14.52 s Done.
+     [Task 17/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 17/25]  Current/Best:   12.75/  22.53 GFLOPS | Progress: (4/20) | 5.44 s     [Task 17/25]  Current/Best:   20.86/  22.95 GFLOPS | Progress: (8/20) | 9.06 s     [Task 17/25]  Current/Best:   12.20/  23.64 GFLOPS | Progress: (12/20) | 11.90 s     [Task 17/25]  Current/Best:   14.40/  23.64 GFLOPS | Progress: (16/20) | 14.27 s     [Task 17/25]  Current/Best:   23.02/  23.71 GFLOPS | Progress: (20/20) | 18.23 s Done.
+     [Task 18/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 18/25]  Current/Best:    3.18/  15.08 GFLOPS | Progress: (4/20) | 6.27 s     [Task 18/25]  Current/Best:   19.23/  19.23 GFLOPS | Progress: (8/20) | 8.76 s     [Task 18/25]  Current/Best:   13.78/  19.23 GFLOPS | Progress: (12/20) | 11.19 s     [Task 18/25]  Current/Best:    3.18/  19.23 GFLOPS | Progress: (16/20) | 19.50 s     [Task 18/25]  Current/Best:   16.95/  19.23 GFLOPS | Progress: (20/20) | 21.35 s Done.
+     [Task 19/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 19/25]  Current/Best:    2.79/  19.12 GFLOPS | Progress: (4/20) | 7.20 s     [Task 19/25]  Current/Best:   20.76/  22.59 GFLOPS | Progress: (8/20) | 9.66 s     [Task 19/25]  Current/Best:    9.34/  22.59 GFLOPS | Progress: (12/20) | 12.96 s     [Task 19/25]  Current/Best:   12.16/  22.88 GFLOPS | Progress: (16/20) | 17.34 s     [Task 19/25]  Current/Best:    9.35/  22.88 GFLOPS | Progress: (20/20) | 23.84 s Done.
+     [Task 20/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 20/25]  Current/Best:   12.61/  17.02 GFLOPS | Progress: (4/20) | 6.61 s     [Task 20/25]  Current/Best:    7.21/  18.64 GFLOPS | Progress: (8/20) | 12.47 s     [Task 20/25]  Current/Best:    9.08/  19.81 GFLOPS | Progress: (12/20) | 16.84 s     [Task 20/25]  Current/Best:   10.01/  19.81 GFLOPS | Progress: (16/20) | 22.09 s     [Task 20/25]  Current/Best:   10.39/  19.81 GFLOPS | Progress: (20/20) | 29.01 s Done.
+     [Task 21/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 21/25]  Current/Best:   11.01/  23.32 GFLOPS | Progress: (4/20) | 13.53 s     [Task 21/25]  Current/Best:    5.52/  23.32 GFLOPS | Progress: (8/20) | 16.43 s     [Task 21/25]  Current/Best:   16.53/  23.32 GFLOPS | Progress: (12/20) | 27.18 s Done.
+     [Task 21/25]  Current/Best:    9.79/  23.32 GFLOPS | Progress: (16/20) | 30.73 s     [Task 21/25]  Current/Best:    7.26/  23.32 GFLOPS | Progress: (20/20) | 42.09 s     [Task 22/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 22/25]  Current/Best:   15.84/  19.32 GFLOPS | Progress: (4/20) | 5.35 s     [Task 22/25]  Current/Best:   11.24/  20.77 GFLOPS | Progress: (8/20) | 7.35 s     [Task 22/25]  Current/Best:   19.66/  20.77 GFLOPS | Progress: (12/20) | 9.30 s     [Task 22/25]  Current/Best:    8.19/  20.77 GFLOPS | Progress: (16/20) | 11.39 s     [Task 22/25]  Current/Best:    8.46/  20.77 GFLOPS | Progress: (20/20) | 15.79 s Done.
+     [Task 23/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 23/25]  Current/Best:   24.05/  24.05 GFLOPS | Progress: (4/20) | 8.07 s     [Task 23/25]  Current/Best:   10.32/  24.05 GFLOPS | Progress: (8/20) | 12.19 s     [Task 23/25]  Current/Best:   24.79/  24.79 GFLOPS | Progress: (12/20) | 15.07 s     [Task 23/25]  Current/Best:   15.39/  24.79 GFLOPS | Progress: (16/20) | 19.25 s     [Task 23/25]  Current/Best:   19.78/  24.79 GFLOPS | Progress: (20/20) | 22.61 s Done.
+     [Task 24/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 24/25]  Current/Best:    2.59/   8.43 GFLOPS | Progress: (4/20) | 6.51 s     [Task 24/25]  Current/Best:    5.50/   8.43 GFLOPS | Progress: (8/20) | 16.91 s     [Task 24/25]  Current/Best:    3.18/  10.59 GFLOPS | Progress: (12/20) | 29.50 s     [Task 24/25]  Current/Best:    3.40/  10.59 GFLOPS | Progress: (16/20) | 40.52 s     [Task 24/25]  Current/Best:    3.78/  10.59 GFLOPS | Progress: (20/20) | 51.58 s     [Task 25/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s     [Task 25/25]  Current/Best:    1.50/   9.71 GFLOPS | Progress: (4/20) | 13.42 s Done.
+     Done.
+     [Task 25/25]  Current/Best:    1.60/   9.71 GFLOPS | Progress: (8/20) | 20.87 s     [Task 25/25]  Current/Best:   10.61/  10.61 GFLOPS | Progress: (12/20) | 31.47 s     [Task 25/25]  Current/Best:    5.74/  10.61 GFLOPS | Progress: (16/20) | 34.57 s     [Task 25/25]  Current/Best:   10.23/  10.61 GFLOPS | Progress: (20/20) | 45.52 s
 
 
 
@@ -674,7 +675,6 @@ model using optimized operators to speed up our computations.
  .. code-block:: none
 
      Done.
-     Done.
 
 
 
@@ -766,8 +766,8 @@ improvement in comparing the optimized model to the unoptimized model.
 
  .. code-block:: none
 
-    optimized: {'mean': 415.74742305001564, 'median': 415.8195728499777, 'std': 1.7602679764603735}
-    unoptimized: {'mean': 492.71140236000065, 'median': 492.71483809998244, 'std': 3.6515681597064877}
+    optimized: {'mean': 379.3111107200457, 'median': 378.54401690001396, 'std': 2.0323214236615597}
+    unoptimized: {'mean': 466.5144750900072, 'median': 466.5101404500092, 'std': 1.2822160389899164}
 
 
 
@@ -790,7 +790,7 @@ profiling/benchmarking.
 
 .. rst-class:: sphx-glr-timing
 
-   **Total running time of the script:** ( 13 minutes  55.849 seconds)
+   **Total running time of the script:** ( 13 minutes  22.695 seconds)
 
 
 .. _sphx_glr_download_tutorial_autotvm_relay_x86.py:
diff --git a/docs/_sources/tutorial/cross_compilation_and_rpc.rst.txt b/docs/_sources/tutorial/cross_compilation_and_rpc.rst.txt
index 619c25c6fe..7fa48b4e9d 100644
--- a/docs/_sources/tutorial/cross_compilation_and_rpc.rst.txt
+++ b/docs/_sources/tutorial/cross_compilation_and_rpc.rst.txt
@@ -274,7 +274,7 @@ device and returns the measured cost. Network overhead is excluded.
 
  .. code-block:: none
 
-    1.269e-07 secs/op
+    1.207e-07 secs/op
 
 
 
diff --git a/docs/_sources/tutorial/intro_topi.rst.txt b/docs/_sources/tutorial/intro_topi.rst.txt
index 6e59b8c584..6738f4c075 100644
--- a/docs/_sources/tutorial/intro_topi.rst.txt
+++ b/docs/_sources/tutorial/intro_topi.rst.txt
@@ -270,7 +270,7 @@ As you can see, scheduled stages of computation have been accumulated and we can
 
  .. code-block:: none
 
-    [stage(a, placeholder(a, 0x2d9615b0)), stage(b, placeholder(b, 0x18f334d0)), stage(T_add, compute(T_add, body=[a[ax0, ax1, ax2] + b[ax1, ax2]], axis=[T.iter_var(ax0, T.Range(0, 100), "DataPar", ""), T.iter_var(ax1, T.Range(0, 10), "DataPar", ""), T.iter_var(ax2, T.Range(0, 10), "DataPar", "")], reduce_axis=[], tag=broadcast, attrs={})), stage(T_multiply, compute(T_multiply, body=[a[ax0, ax1, ax2] * b[ax1, ax2]], axis=[T.iter_var(ax0, T.Range(0, 100), "DataPar", ""), T.iter_var(ax1, T [...]
+    [stage(a, placeholder(a, 0x1081b4e0)), stage(b, placeholder(b, 0x107ff4b0)), stage(T_add, compute(T_add, body=[a[ax0, ax1, ax2] + b[ax1, ax2]], axis=[T.iter_var(ax0, T.Range(0, 100), "DataPar", ""), T.iter_var(ax1, T.Range(0, 10), "DataPar", ""), T.iter_var(ax2, T.Range(0, 10), "DataPar", "")], reduce_axis=[], tag=broadcast, attrs={})), stage(T_multiply, compute(T_multiply, body=[a[ax0, ax1, ax2] * b[ax1, ax2]], axis=[T.iter_var(ax0, T.Range(0, 100), "DataPar", ""), T.iter_var(ax1, T [...]
 
 
 
diff --git a/docs/_sources/tutorial/sg_execution_times.rst.txt b/docs/_sources/tutorial/sg_execution_times.rst.txt
index ce1edd8bd1..523b2be330 100644
--- a/docs/_sources/tutorial/sg_execution_times.rst.txt
+++ b/docs/_sources/tutorial/sg_execution_times.rst.txt
@@ -5,24 +5,24 @@
 
 Computation times
 =================
-**17:32.525** total execution time for **tutorial** files:
+**16:46.661** total execution time for **tutorial** files:
 
 +------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_tutorial_autotvm_relay_x86.py` (``autotvm_relay_x86.py``)                 | 13:55.849 | 0.0 MB |
+| :ref:`sphx_glr_tutorial_autotvm_relay_x86.py` (``autotvm_relay_x86.py``)                 | 13:22.695 | 0.0 MB |
 +------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_tutorial_auto_scheduler_matmul_x86.py` (``auto_scheduler_matmul_x86.py``) | 01:31.186 | 0.0 MB |
+| :ref:`sphx_glr_tutorial_auto_scheduler_matmul_x86.py` (``auto_scheduler_matmul_x86.py``) | 01:30.520 | 0.0 MB |
 +------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_tutorial_tensor_expr_get_started.py` (``tensor_expr_get_started.py``)     | 01:01.600 | 0.0 MB |
+| :ref:`sphx_glr_tutorial_tensor_expr_get_started.py` (``tensor_expr_get_started.py``)     | 00:58.454 | 0.0 MB |
 +------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_tutorial_relay_quick_start.py` (``relay_quick_start.py``)                 | 00:42.693 | 0.0 MB |
+| :ref:`sphx_glr_tutorial_relay_quick_start.py` (``relay_quick_start.py``)                 | 00:40.197 | 0.0 MB |
 +------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_tutorial_autotvm_matmul_x86.py` (``autotvm_matmul_x86.py``)               | 00:19.107 | 0.0 MB |
+| :ref:`sphx_glr_tutorial_autotvm_matmul_x86.py` (``autotvm_matmul_x86.py``)               | 00:12.681 | 0.0 MB |
 +------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_tutorial_intro_topi.py` (``intro_topi.py``)                               | 00:00.970 | 0.0 MB |
+| :ref:`sphx_glr_tutorial_tensor_ir_blitz_course.py` (``tensor_ir_blitz_course.py``)       | 00:00.989 | 0.0 MB |
 +------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_tutorial_tensor_ir_blitz_course.py` (``tensor_ir_blitz_course.py``)       | 00:00.921 | 0.0 MB |
+| :ref:`sphx_glr_tutorial_intro_topi.py` (``intro_topi.py``)                               | 00:00.929 | 0.0 MB |
 +------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_tutorial_cross_compilation_and_rpc.py` (``cross_compilation_and_rpc.py``) | 00:00.197 | 0.0 MB |
+| :ref:`sphx_glr_tutorial_cross_compilation_and_rpc.py` (``cross_compilation_and_rpc.py``) | 00:00.196 | 0.0 MB |
 +------------------------------------------------------------------------------------------+-----------+--------+
 | :ref:`sphx_glr_tutorial_uma.py` (``uma.py``)                                             | 00:00.000 | 0.0 MB |
 +------------------------------------------------------------------------------------------+-----------+--------+
diff --git a/docs/_sources/tutorial/tensor_expr_get_started.rst.txt b/docs/_sources/tutorial/tensor_expr_get_started.rst.txt
index 2155c5aa19..1e0a3ad1b0 100644
--- a/docs/_sources/tutorial/tensor_expr_get_started.rst.txt
+++ b/docs/_sources/tutorial/tensor_expr_get_started.rst.txt
@@ -285,8 +285,8 @@ helper function to run a profile of the TVM generated code.
 
  .. code-block:: none
 
-    Numpy running time: 0.000008
-    naive: 0.000008
+    Numpy running time: 0.000006
+    naive: 0.000006
 
 
 
@@ -444,7 +444,7 @@ factor to be the number of threads on your CPU.
 
  .. code-block:: none
 
-    vector: 0.000039
+    vector: 0.000038
     # from tvm.script import ir as I
     # from tvm.script import tir as T
 
@@ -498,10 +498,10 @@ We can now compare the different schedules
  .. code-block:: none
 
                 Operator                  Timing             Performance
-                   numpy    7.671109997318127e-06                    1.0
-                   naive    7.830600000000001e-06      1.020790994098329
-                parallel    7.766199999999999e-06     1.0123958596233291
-                  vector             3.92464e-05       5.116130522665014
+                   numpy    5.601880002359394e-06                    1.0
+                   naive              5.6479e-06      1.0082150987920524
+                parallel    7.738200000000002e-06      1.381357686480402
+                  vector    3.7925999999999996e-05     6.770227135180754
 
 
 
@@ -922,7 +922,7 @@ matrix multiplication.
 
  .. code-block:: none
 
-    Numpy running time: 0.018811
+    Numpy running time: 0.013683
 
 
 
@@ -980,7 +980,7 @@ optimizations.
 
  .. code-block:: none
 
-    none: 3.526149
+    none: 3.380417
 
 
 
@@ -1080,7 +1080,7 @@ schedule.
 
  .. code-block:: none
 
-    blocking: 0.287237
+    blocking: 0.290640
 
 
 
@@ -1164,7 +1164,7 @@ already cache friendly from our previous optimizations.
 
  .. code-block:: none
 
-    vectorization: 0.272752
+    vectorization: 0.265125
     # from tvm.script import ir as I
     # from tvm.script import tir as T
 
@@ -1230,7 +1230,7 @@ more cache friendly.
 
  .. code-block:: none
 
-    loop permutation: 0.118667
+    loop permutation: 0.112804
     # from tvm.script import ir as I
     # from tvm.script import tir as T
 
@@ -1321,7 +1321,7 @@ optimized schedule.
 
  .. code-block:: none
 
-    array packing: 0.107560
+    array packing: 0.103097
     # from tvm.script import ir as I
     # from tvm.script import tir as T
 
@@ -1404,7 +1404,7 @@ to `C` when all the block results are ready.
 
  .. code-block:: none
 
-    block caching: 0.111709
+    block caching: 0.094137
     # from tvm.script import ir as I
     # from tvm.script import tir as T
 
@@ -1478,7 +1478,7 @@ of thread-level parallelization.
 
  .. code-block:: none
 
-    parallelization: 0.132621
+    parallelization: 0.112458
     # from tvm.script import ir as I
     # from tvm.script import tir as T
 
@@ -1548,13 +1548,13 @@ working, we can compare the results.
  .. code-block:: none
 
                 Operator                  Timing             Performance
-                    none      3.5261488836999995                     1.0
-                blocking     0.28723746299999997     0.08145925554300497
-           vectorization            0.2727517516      0.0773511727938727
-        loop permutation             0.118667046     0.03365344173314721
-           array packing            0.1075599755    0.030503526381772336
-           block caching            0.1117090457    0.031680184071746664
-         parallelization     0.13262143129999998     0.03761084278461886
+                    none      3.3804169742999997                     1.0
+                blocking     0.29064017789999996     0.08597761167028346
+           vectorization     0.26512477300000004     0.07842960647033811
+        loop permutation            0.1128040781    0.033369870923500175
+           array packing     0.10309722699999999    0.030498375728144858
+           block caching            0.0941365962     0.02784762853685923
+         parallelization            0.1124583613     0.03326760046319059
 
 
 
@@ -1594,11 +1594,6 @@ operations with tunable parameters that allows you to automatically optimize
 the computation for specific platforms.
 
 
-.. rst-class:: sphx-glr-timing
-
-   **Total running time of the script:** ( 1 minutes  1.600 seconds)
-
-
 .. _sphx_glr_download_tutorial_tensor_expr_get_started.py:
 
 .. only:: html
diff --git a/docs/commit_hash b/docs/commit_hash
index a01dcbe9ea..e4cd2456bd 100644
--- a/docs/commit_hash
+++ b/docs/commit_hash
@@ -1 +1 @@
-efc2ae9846054506ee4596d49c55b6ecc7c89800
+5645c52c6d3105fb6c58cb7e1d983eff6ff26c19
diff --git a/docs/how_to/compile_models/from_darknet.html b/docs/how_to/compile_models/from_darknet.html
index 8ee6a94a77..18d0112bf0 100644
--- a/docs/how_to/compile_models/from_darknet.html
+++ b/docs/how_to/compile_models/from_darknet.html
@@ -604,7 +604,7 @@ class:[&#39;truck 0.9266&#39;] left:471 top:83 right:689 bottom:169
 class:[&#39;bicycle 0.9984&#39;] left:111 top:113 right:577 bottom:447
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  34.628 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  33.114 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-compile-models-from-darknet-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/7716f96385bd5abb6e822041e285be54/from_darknet.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">from_darknet.py</span></code></a></p>
diff --git a/docs/how_to/compile_models/from_oneflow.html b/docs/how_to/compile_models/from_oneflow.html
index fcb422009c..a083f82c68 100644
--- a/docs/how_to/compile_models/from_oneflow.html
+++ b/docs/how_to/compile_models/from_oneflow.html
@@ -468,13 +468,14 @@ Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdo
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Downloading: &quot;https://oneflow-public.oss-cn-beijing.aliyuncs.com/model_zoo/flowvision/classification/ResNet/resnet18.zip&quot; to /workspace/.oneflow/flowvision_cache/resnet18.zip
 
   0%|          | 0.00/41.5M [00:00&lt;?, ?B/s]
- 15%|#5        | 6.33M/41.5M [00:00&lt;00:00, 65.6MB/s]
- 35%|###4      | 14.3M/41.5M [00:00&lt;00:00, 76.2MB/s]
- 52%|#####2    | 21.6M/41.5M [00:00&lt;00:00, 41.1MB/s]
- 64%|######4   | 26.7M/41.5M [00:00&lt;00:00, 25.5MB/s]
- 77%|#######7  | 32.0M/41.5M [00:01&lt;00:00, 29.1MB/s]
- 92%|#########2| 38.3M/41.5M [00:01&lt;00:00, 35.2MB/s]
-100%|##########| 41.5M/41.5M [00:01&lt;00:00, 36.1MB/s]
+ 19%|#8        | 7.72M/41.5M [00:00&lt;00:00, 80.9MB/s]
+ 37%|###7      | 15.4M/41.5M [00:00&lt;00:00, 33.5MB/s]
+ 48%|####8     | 20.0M/41.5M [00:00&lt;00:00, 32.6MB/s]
+ 57%|#####7    | 23.8M/41.5M [00:00&lt;00:00, 21.9MB/s]
+ 64%|######3   | 26.5M/41.5M [00:01&lt;00:00, 20.2MB/s]
+ 77%|#######7  | 32.0M/41.5M [00:01&lt;00:00, 25.2MB/s]
+ 92%|#########2| 38.3M/41.5M [00:01&lt;00:00, 30.4MB/s]
+100%|##########| 41.5M/41.5M [00:01&lt;00:00, 28.6MB/s]
 </pre></div>
 </div>
 </div>
diff --git a/docs/how_to/compile_models/from_paddle.html b/docs/how_to/compile_models/from_paddle.html
index 1e929e3092..12b26b40e5 100644
--- a/docs/how_to/compile_models/from_paddle.html
+++ b/docs/how_to/compile_models/from_paddle.html
@@ -503,7 +503,6 @@ To begin, we’ll install PaddlePaddle&gt;=2.1.3:</p>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>TVM prediction top-1 id: 282, class name:  282: &#39;tiger cat&#39;,
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  1.325 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-compile-models-from-paddle-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/16269b77359771348d507395692524cf/from_paddle.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">from_paddle.py</span></code></a></p>
diff --git a/docs/how_to/compile_models/from_pytorch.html b/docs/how_to/compile_models/from_pytorch.html
index 54ed4ab737..5d615b6207 100644
--- a/docs/how_to/compile_models/from_pytorch.html
+++ b/docs/how_to/compile_models/from_pytorch.html
@@ -451,14 +451,17 @@ be unstable.</p>
 Downloading: &quot;https://download.pytorch.org/models/resnet18-f37072fd.pth&quot; to /workspace/.cache/torch/hub/checkpoints/resnet18-f37072fd.pth
 
   0%|          | 0.00/44.7M [00:00&lt;?, ?B/s]
- 15%|#5        | 6.84M/44.7M [00:00&lt;00:00, 71.7MB/s]
- 31%|###       | 13.7M/44.7M [00:00&lt;00:00, 52.3MB/s]
- 42%|####2     | 19.0M/44.7M [00:00&lt;00:00, 45.7MB/s]
- 54%|#####3    | 24.0M/44.7M [00:00&lt;00:00, 38.8MB/s]
- 72%|#######1  | 32.0M/44.7M [00:00&lt;00:00, 38.0MB/s]
- 86%|########5 | 38.3M/44.7M [00:01&lt;00:00, 35.1MB/s]
- 94%|#########3| 41.8M/44.7M [00:01&lt;00:00, 32.9MB/s]
-100%|##########| 44.7M/44.7M [00:01&lt;00:00, 39.9MB/s]
+ 14%|#4        | 6.30M/44.7M [00:00&lt;00:00, 44.3MB/s]
+ 24%|##3       | 10.5M/44.7M [00:00&lt;00:00, 39.6MB/s]
+ 32%|###2      | 14.3M/44.7M [00:00&lt;00:00, 33.1MB/s]
+ 39%|###9      | 17.5M/44.7M [00:00&lt;00:00, 30.1MB/s]
+ 54%|#####3    | 24.0M/44.7M [00:00&lt;00:00, 40.9MB/s]
+ 63%|######2   | 28.1M/44.7M [00:00&lt;00:00, 33.9MB/s]
+ 71%|#######   | 31.7M/44.7M [00:00&lt;00:00, 33.1MB/s]
+ 78%|#######8  | 35.0M/44.7M [00:01&lt;00:00, 29.2MB/s]
+ 86%|########5 | 38.3M/44.7M [00:01&lt;00:00, 28.8MB/s]
+100%|#########9| 44.7M/44.7M [00:01&lt;00:00, 38.1MB/s]
+100%|##########| 44.7M/44.7M [00:01&lt;00:00, 34.9MB/s]
 </pre></div>
 </div>
 </div>
diff --git a/docs/how_to/compile_models/from_tensorflow.html b/docs/how_to/compile_models/from_tensorflow.html
index 83858db177..7da76e90c8 100644
--- a/docs/how_to/compile_models/from_tensorflow.html
+++ b/docs/how_to/compile_models/from_tensorflow.html
@@ -671,7 +671,7 @@ banana (score = 0.00022)
 desk (score = 0.00019)
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  32.962 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  27.491 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-compile-models-from-tensorflow-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/7f1d3d1b878694c201c614c807cdebc8/from_tensorflow.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">from_tensorflow.py</span></code></a></p>
diff --git a/docs/how_to/compile_models/sg_execution_times.html b/docs/how_to/compile_models/sg_execution_times.html
index 4c56e950d5..0ad5e38a32 100644
--- a/docs/how_to/compile_models/sg_execution_times.html
+++ b/docs/how_to/compile_models/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-how-to-compile-models-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>06:41.067</strong> total execution time for <strong>how_to_compile_models</strong> files:</p>
+<p><strong>06:22.235</strong> total execution time for <strong>how_to_compile_models</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 81%" />
@@ -369,39 +369,39 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="from_darknet.html#sphx-glr-how-to-compile-models-from-darknet-py"><span class="std std-ref">Compile YOLO-V2 and YOLO-V3 in DarkNet Models</span></a> (<code class="docutils literal notranslate"><span class="pre">from_darknet.py</span></code>)</p></td>
-<td><p>01:34.628</p></td>
+<td><p>01:33.114</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="from_tensorflow.html#sphx-glr-how-to-compile-models-from-tensorflow-py"><span class="std std-ref">Compile Tensorflow Models</span></a> (<code class="docutils literal notranslate"><span class="pre">from_tensorflow.py</span></code>)</p></td>
-<td><p>01:32.962</p></td>
+<td><p>01:27.491</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="from_paddle.html#sphx-glr-how-to-compile-models-from-paddle-py"><span class="std std-ref">Compile PaddlePaddle Models</span></a> (<code class="docutils literal notranslate"><span class="pre">from_paddle.py</span></code>)</p></td>
-<td><p>01:01.325</p></td>
+<td><p>00:57.633</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="from_oneflow.html#sphx-glr-how-to-compile-models-from-oneflow-py"><span class="std std-ref">Compile OneFlow Models</span></a> (<code class="docutils literal notranslate"><span class="pre">from_oneflow.py</span></code>)</p></td>
-<td><p>00:43.983</p></td>
+<td><p>00:41.333</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="from_coreml.html#sphx-glr-how-to-compile-models-from-coreml-py"><span class="std std-ref">Compile CoreML Models</span></a> (<code class="docutils literal notranslate"><span class="pre">from_coreml.py</span></code>)</p></td>
-<td><p>00:37.967</p></td>
+<td><p>00:35.889</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="from_pytorch.html#sphx-glr-how-to-compile-models-from-pytorch-py"><span class="std std-ref">Compile PyTorch Models</span></a> (<code class="docutils literal notranslate"><span class="pre">from_pytorch.py</span></code>)</p></td>
-<td><p>00:29.332</p></td>
+<td><p>00:27.349</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="from_keras.html#sphx-glr-how-to-compile-models-from-keras-py"><span class="std std-ref">Compile Keras Models</span></a> (<code class="docutils literal notranslate"><span class="pre">from_keras.py</span></code>)</p></td>
-<td><p>00:25.769</p></td>
+<td><p>00:24.610</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="from_tflite.html#sphx-glr-how-to-compile-models-from-tflite-py"><span class="std std-ref">Compile TFLite Models</span></a> (<code class="docutils literal notranslate"><span class="pre">from_tflite.py</span></code>)</p></td>
-<td><p>00:12.048</p></td>
+<td><p>00:12.040</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="from_onnx.html#sphx-glr-how-to-compile-models-from-onnx-py"><span class="std std-ref">Compile ONNX Models</span></a> (<code class="docutils literal notranslate"><span class="pre">from_onnx.py</span></code>)</p></td>
-<td><p>00:03.054</p></td>
+<td><p>00:02.776</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 </tbody>
diff --git a/docs/how_to/deploy/adreno.html b/docs/how_to/deploy/adreno.html
index 2c285bdc13..db7f5e71b9 100644
--- a/docs/how_to/deploy/adreno.html
+++ b/docs/how_to/deploy/adreno.html
@@ -270,6 +270,7 @@
 <li class="toctree-l4"><a class="reference internal" href="tensorrt.html">Relay TensorRT Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="vitis_ai.html">Vitis AI Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="bnns.html">Relay BNNS Integration</a></li>
+<li class="toctree-l4"><a class="reference internal" href="mrvl.html">Marvell Machine Learning Integration</a></li>
 </ul>
 </li>
 <li class="toctree-l3"><a class="reference internal" href="index.html#additional-deployment-how-tos">Additional Deployment How-Tos</a></li>
diff --git a/docs/how_to/deploy/android.html b/docs/how_to/deploy/android.html
index 5f57c4d8ad..3be9e7fb1b 100644
--- a/docs/how_to/deploy/android.html
+++ b/docs/how_to/deploy/android.html
@@ -270,6 +270,7 @@
 <li class="toctree-l4"><a class="reference internal" href="tensorrt.html">Relay TensorRT Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="vitis_ai.html">Vitis AI Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="bnns.html">Relay BNNS Integration</a></li>
+<li class="toctree-l4"><a class="reference internal" href="mrvl.html">Marvell Machine Learning Integration</a></li>
 </ul>
 </li>
 <li class="toctree-l3"><a class="reference internal" href="index.html#additional-deployment-how-tos">Additional Deployment How-Tos</a></li>
diff --git a/docs/how_to/deploy/arm_compute_lib.html b/docs/how_to/deploy/arm_compute_lib.html
index 49623bd5ae..cbeea1746a 100644
--- a/docs/how_to/deploy/arm_compute_lib.html
+++ b/docs/how_to/deploy/arm_compute_lib.html
@@ -270,6 +270,7 @@
 <li class="toctree-l4"><a class="reference internal" href="tensorrt.html">Relay TensorRT Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="vitis_ai.html">Vitis AI Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="bnns.html">Relay BNNS Integration</a></li>
+<li class="toctree-l4"><a class="reference internal" href="mrvl.html">Marvell Machine Learning Integration</a></li>
 </ul>
 </li>
 <li class="toctree-l3"><a class="reference internal" href="index.html#additional-deployment-how-tos">Additional Deployment How-Tos</a></li>
diff --git a/docs/how_to/deploy/bnns.html b/docs/how_to/deploy/bnns.html
index cabbfc10ff..8f61173bd5 100644
--- a/docs/how_to/deploy/bnns.html
+++ b/docs/how_to/deploy/bnns.html
@@ -48,7 +48,7 @@
     <script type="text/javascript" src="../../_static/js/tlcpack_theme.js"></script>
     <link rel="index" title="Index" href="../../genindex.html" />
     <link rel="search" title="Search" href="../../search.html" />
-    <link rel="next" title="Deploy Deep Learning Models" href="../deploy_models/index.html" />
+    <link rel="next" title="Marvell Machine Learning Integration" href="mrvl.html" />
     <link rel="prev" title="Vitis AI Integration" href="vitis_ai.html" /> 
 </head>
 
@@ -270,6 +270,7 @@
 <li class="toctree-l4"><a class="reference internal" href="tensorrt.html">Relay TensorRT Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="vitis_ai.html">Vitis AI Integration</a></li>
 <li class="toctree-l4 current"><a class="current reference internal" href="#">Relay BNNS Integration</a></li>
+<li class="toctree-l4"><a class="reference internal" href="mrvl.html">Marvell Machine Learning Integration</a></li>
 </ul>
 </li>
 <li class="toctree-l3"><a class="reference internal" href="index.html#additional-deployment-how-tos">Additional Deployment How-Tos</a></li>
@@ -563,7 +564,7 @@ fusion</p></td>
 
     <div class="rst-footer-buttons" role="navigation" aria-label="footer navigation">
       
-        <a href="../deploy_models/index.html" class="btn btn-neutral float-right" title="Deploy Deep Learning Models" accesskey="n" rel="next">Next <span class="fa fa-arrow-circle-right"></span></a>
+        <a href="mrvl.html" class="btn btn-neutral float-right" title="Marvell Machine Learning Integration" accesskey="n" rel="next">Next <span class="fa fa-arrow-circle-right"></span></a>
       
       
         <a href="vitis_ai.html" class="btn btn-neutral float-left" title="Vitis AI Integration" accesskey="p" rel="prev"><span class="fa fa-arrow-circle-left"></span> Previous</a>
diff --git a/docs/how_to/deploy/cpp_deploy.html b/docs/how_to/deploy/cpp_deploy.html
index f1c53b8b8a..c4541ed544 100644
--- a/docs/how_to/deploy/cpp_deploy.html
+++ b/docs/how_to/deploy/cpp_deploy.html
@@ -270,6 +270,7 @@
 <li class="toctree-l4"><a class="reference internal" href="tensorrt.html">Relay TensorRT Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="vitis_ai.html">Vitis AI Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="bnns.html">Relay BNNS Integration</a></li>
+<li class="toctree-l4"><a class="reference internal" href="mrvl.html">Marvell Machine Learning Integration</a></li>
 </ul>
 </li>
 <li class="toctree-l3"><a class="reference internal" href="index.html#additional-deployment-how-tos">Additional Deployment How-Tos</a></li>
diff --git a/docs/how_to/deploy/hls.html b/docs/how_to/deploy/hls.html
index 11d82f5027..dbe0ec0b07 100644
--- a/docs/how_to/deploy/hls.html
+++ b/docs/how_to/deploy/hls.html
@@ -270,6 +270,7 @@
 <li class="toctree-l4"><a class="reference internal" href="tensorrt.html">Relay TensorRT Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="vitis_ai.html">Vitis AI Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="bnns.html">Relay BNNS Integration</a></li>
+<li class="toctree-l4"><a class="reference internal" href="mrvl.html">Marvell Machine Learning Integration</a></li>
 </ul>
 </li>
 <li class="toctree-l3"><a class="reference internal" href="index.html#additional-deployment-how-tos">Additional Deployment How-Tos</a></li>
diff --git a/docs/how_to/deploy/index.html b/docs/how_to/deploy/index.html
index debcec3d3b..31ccb9dd38 100644
--- a/docs/how_to/deploy/index.html
+++ b/docs/how_to/deploy/index.html
@@ -274,6 +274,7 @@
 <li class="toctree-l4"><a class="reference internal" href="tensorrt.html">Relay TensorRT Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="vitis_ai.html">Vitis AI Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="bnns.html">Relay BNNS Integration</a></li>
+<li class="toctree-l4"><a class="reference internal" href="mrvl.html">Marvell Machine Learning Integration</a></li>
 </ul>
 </li>
 <li class="toctree-l3"><a class="reference internal" href="#additional-deployment-how-tos">Additional Deployment How-Tos</a><ul>
@@ -594,6 +595,20 @@ target device without relying on RPC. See the following resources on how to do s
 <li class="toctree-l2"><a class="reference internal" href="bnns.html#operator-support">Operator support</a></li>
 </ul>
 </li>
+<li class="toctree-l1"><a class="reference internal" href="mrvl.html">Marvell Machine Learning Integration</a><ul>
+<li class="toctree-l2"><a class="reference internal" href="mrvl.html#introduction">1. Introduction</a></li>
+<li class="toctree-l2"><a class="reference internal" href="mrvl.html#building-tvm-with-mrvl-support">2. Building TVM with mrvl support</a></li>
+<li class="toctree-l2"><a class="reference internal" href="mrvl.html#clone-tvm-repo">2.1 Clone TVM repo</a></li>
+<li class="toctree-l2"><a class="reference internal" href="mrvl.html#build-and-start-the-tvm-mrvl-docker-container">2.2 Build and start the TVM - mrvl docker container</a></li>
+<li class="toctree-l2"><a class="reference internal" href="mrvl.html#build-tvm-inside-the-docker-container-with-mrvl-inside-tvm-directory">3. Build TVM inside the docker container with mrvl (inside tvm directory)</a></li>
+<li class="toctree-l2"><a class="reference internal" href="mrvl.html#compiling-a-model-using-tvmc-command-line">4. Compiling a model using TVMC command line</a></li>
+<li class="toctree-l2"><a class="reference internal" href="mrvl.html#tvmc-compilation-flow-for-a-model">4.1 TVMC Compilation Flow for a model</a></li>
+<li class="toctree-l2"><a class="reference internal" href="mrvl.html#tvmc-command-line-option-s-syntax-for-mrvl-target">4.2. TVMC - Command line option(s): Syntax for mrvl target</a></li>
+<li class="toctree-l2"><a class="reference internal" href="mrvl.html#tvmc-compiler-mrvl-specific-command-line-options">4.3. TVMC Compiler: mrvl specific Command Line Options</a></li>
+<li class="toctree-l2"><a class="reference internal" href="mrvl.html#compilation-generating-model-partitions">5. Compilation - Generating model partitions</a></li>
+<li class="toctree-l2"><a class="reference internal" href="mrvl.html#compiling-a-model-using-python-apis">6. Compiling a model using Python APIs</a></li>
+</ul>
+</li>
 </ul>
 </div>
 </div>
diff --git a/docs/how_to/deploy/integrate.html b/docs/how_to/deploy/integrate.html
index 4f894e6719..c8fb7882c6 100644
--- a/docs/how_to/deploy/integrate.html
+++ b/docs/how_to/deploy/integrate.html
@@ -270,6 +270,7 @@
 <li class="toctree-l4"><a class="reference internal" href="tensorrt.html">Relay TensorRT Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="vitis_ai.html">Vitis AI Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="bnns.html">Relay BNNS Integration</a></li>
+<li class="toctree-l4"><a class="reference internal" href="mrvl.html">Marvell Machine Learning Integration</a></li>
 </ul>
 </li>
 <li class="toctree-l3"><a class="reference internal" href="index.html#additional-deployment-how-tos">Additional Deployment How-Tos</a></li>
diff --git a/docs/how_to/deploy/bnns.html b/docs/how_to/deploy/mrvl.html
similarity index 54%
copy from docs/how_to/deploy/bnns.html
copy to docs/how_to/deploy/mrvl.html
index cabbfc10ff..9952212c1a 100644
--- a/docs/how_to/deploy/bnns.html
+++ b/docs/how_to/deploy/mrvl.html
@@ -11,7 +11,7 @@
   
   <meta name="viewport" content="width=device-width, initial-scale=1.0">
   
-  <title>Relay BNNS Integration &mdash; tvm 0.16.dev0 documentation</title>
+  <title>Marvell Machine Learning Integration &mdash; tvm 0.16.dev0 documentation</title>
   
 
   
@@ -49,7 +49,7 @@
     <link rel="index" title="Index" href="../../genindex.html" />
     <link rel="search" title="Search" href="../../search.html" />
     <link rel="next" title="Deploy Deep Learning Models" href="../deploy_models/index.html" />
-    <link rel="prev" title="Vitis AI Integration" href="vitis_ai.html" /> 
+    <link rel="prev" title="Relay BNNS Integration" href="bnns.html" /> 
 </head>
 
 <body class="wy-body-for-nav">
@@ -269,7 +269,8 @@
 <li class="toctree-l4"><a class="reference internal" href="arm_compute_lib.html">Relay Arm<sup>®</sup> Compute Library Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="tensorrt.html">Relay TensorRT Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="vitis_ai.html">Vitis AI Integration</a></li>
-<li class="toctree-l4 current"><a class="current reference internal" href="#">Relay BNNS Integration</a></li>
+<li class="toctree-l4"><a class="reference internal" href="bnns.html">Relay BNNS Integration</a></li>
+<li class="toctree-l4 current"><a class="current reference internal" href="#">Marvell Machine Learning Integration</a></li>
 </ul>
 </li>
 <li class="toctree-l3"><a class="reference internal" href="index.html#additional-deployment-how-tos">Additional Deployment How-Tos</a></li>
@@ -369,7 +370,7 @@
         
           <li><a href="index.html">Deploy Models and Integrate TVM</a> <span class="br-arrow">></span></li>
         
-      <li>Relay BNNS Integration</li>
+      <li>Marvell Machine Learning Integration</li>
     
     
       
@@ -382,7 +383,7 @@
         
             
             
-              <a href="https://github.com/apache/tvm/edit/main/docs/how_to/deploy/bnns.rst" class="fa fa-github"> Edit on GitHub</a>
+              <a href="https://github.com/apache/tvm/edit/main/docs/how_to/deploy/mrvl.rst" class="fa fa-github"> Edit on GitHub</a>
             
           
         
@@ -396,160 +397,184 @@
           <div role="main" class="document" itemscope="itemscope" itemtype="http://schema.org/Article">
            <div itemprop="articleBody">
             
-  <div class="section" id="relay-bnns-integration">
-<h1>Relay BNNS Integration<a class="headerlink" href="#relay-bnns-integration" title="Permalink to this headline">¶</a></h1>
-<p><strong>Author</strong>: <a class="reference external" href="https://github.com/echuraev">Egor Churaev</a></p>
+  <div class="section" id="marvell-machine-learning-integration">
+<h1>Marvell Machine Learning Integration<a class="headerlink" href="#marvell-machine-learning-integration" title="Permalink to this headline">¶</a></h1>
 <div class="section" id="introduction">
-<h2>Introduction<a class="headerlink" href="#introduction" title="Permalink to this headline">¶</a></h2>
-<p>Apple BNNS library is a collection of functions that can be used to construct neural networks
-for inference (and train). It’s supported in macOS, iOS, tvOS, and watchOS. BNNS provides
-primitives executed on all CPU supported on those platforms and optimized for high performance
-and low-energy consumption. This integration will offload as many operators as possible from Relay to BNNS.</p>
-<p>BNNS runtime is a part of platform API and available on all modern Apple operating systems.
-Application using BNNS will not depends on any additional external dependencies.</p>
-<p>BNNS functions uses Apple private hardware capabilities which are not exposed yet by Apple. Example
-of such capabilities can be AMX Apple cpu extension.</p>
-<p>This guide will demonstrate how to build TVM with BNNS codegen and runtime enabled. It will also provide example
-code to compile and run models using BNNS runtime. Finally, we document the supported operators.</p>
+<h2>1. Introduction<a class="headerlink" href="#introduction" title="Permalink to this headline">¶</a></h2>
+<p>Marvell(R) supports a family of high-performance Data Processing
+Units (DPUs) with integrated compute, high-speed I/O and workload
+accelerators. These workload accelerators include Marvell’s
+Machine Learning Inference Processor (MLIP), a highly optimized,
+integrated inference engine.</p>
+<p>TVM supports Marvell’s MLIP through the “mrvl” library, which partitions a model and
+compiles the supported operations for accelerated execution on the MLIP, leaving the
+remaining operations to LLVM for general compute.</p>
+<p>For runtime, the library supports native execution on MLIP hardware
+as well as Marvell’s ML simulator (mlModel).</p>
+<p>The library supports Marvell’s Octeon family of processors with ML accelerators.</p>
+<p>This guide demonstrates how to build TVM with the mrvl codegen and
+runtime enabled. It also provides example code to compile and run
+models using the ‘mrvl’ runtime.</p>
 </div>
-<div class="section" id="building-tvm-with-bnns-support">
-<h2>Building TVM with BNNS support<a class="headerlink" href="#building-tvm-with-bnns-support" title="Permalink to this headline">¶</a></h2>
-<p>To turn on TVM BNNS codegen and TVM BNNS runtime you need to turn on the only USE_BNNS flag</p>
-<ul class="simple">
-<li><p>USE_BNNS=ON/OFF - This flag will enable compiling a network with offloading subgraphs to BNNS primitives
-and will link tvm library to the BNNS runtime module.</p></li>
-</ul>
-<p>Enabling of this flag will cause to search the default Accelerate Frameworks on current target SDK.
-The minimal versions of required SDK is macOS 11.0, iOS 14.0, tvOS 14.0 and watchOS 7.0.</p>
-<p>Example setting in config.cmake file:</p>
-<div class="highlight-cmake notranslate"><div class="highlight"><pre><span></span><span class="nb">set</span><span class="p">(</span><span class="s">USE_BNNS</span><span class="w"> </span><span class="s">ON</span><span class="p">)</span>
+<div class="section" id="building-tvm-with-mrvl-support">
+<h2>2. Building TVM with mrvl support<a class="headerlink" href="#building-tvm-with-mrvl-support" title="Permalink to this headline">¶</a></h2>
+</div>
+<div class="section" id="clone-tvm-repo">
+<h2>2.1 Clone TVM repo<a class="headerlink" href="#clone-tvm-repo" title="Permalink to this headline">¶</a></h2>
+<p>Refer to the following TVM documentation for cloning TVM
+<a class="reference external" href="https://tvm.apache.org/docs/install/from_source.html">https://tvm.apache.org/docs/install/from_source.html</a></p>
+</div>
+<div class="section" id="build-and-start-the-tvm-mrvl-docker-container">
+<h2>2.2 Build and start the TVM - mrvl docker container<a class="headerlink" href="#build-and-start-the-tvm-mrvl-docker-container" title="Permalink to this headline">¶</a></h2>
+<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>./docker/build.sh<span class="w"> </span>demo_mrvl<span class="w"> </span>bash<span class="w">                              </span><span class="c1"># Build the docker container</span>
+./docker/bash.sh<span class="w"> </span>tvm.demo_mrvl<span class="w"> </span>--env<span class="w"> </span><span class="nv">PYTHONPATH</span><span class="o">=</span><span class="nv">$PWD</span>/python<span class="w">   </span><span class="c1"># Load the docker image</span>
 </pre></div>
 </div>
 </div>
-<div class="section" id="bnns-partitioning-of-relay-graph">
-<h2>BNNS partitioning of Relay graph<a class="headerlink" href="#bnns-partitioning-of-relay-graph" title="Permalink to this headline">¶</a></h2>
-<p>Operations to be offloaded on BNNS execution must be annotated before passing of module for compilation.
-All ops annotated by <cite>partition_for_bnns</cite> will be offloaded for BNNS execution. The rest of the ops
-will go through the LLVM compilation and code generation.</p>
-<p>Important note: BNNS support primitives only with constant weights. To satisfy this requirements we have
-to map constants to related tensor abstraction in relay representation. To freeze tensors and operate
-with them as constants you may need to call ONNX importer with special flag “freeze_params=True”
-or performer binding manually. In general cases all relay importers don’t do that by default.
-For your convenience “partition_for_bnns” can do this for you if params dictionary is passed as the argument.</p>
-<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">tvm.relay.op.contrib.bnns</span> <span class="kn">import</span> <span class="n">partition_for_bnns</span>
-<span class="n">model</span> <span class="o">=</span> <span class="n">partition_for_bnns</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">params</span><span class="o">=</span><span class="n">params</span><span class="p">)</span>
+<div class="section" id="build-tvm-inside-the-docker-container-with-mrvl-inside-tvm-directory">
+<h2>3. Build TVM inside the docker container with mrvl (inside tvm directory)<a class="headerlink" href="#build-tvm-inside-the-docker-container-with-mrvl-inside-tvm-directory" title="Permalink to this headline">¶</a></h2>
+<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>./tests/scripts/task_config_build_mrvl.sh<span class="w"> </span>build
+<span class="nb">cd</span><span class="w"> </span>build
+cmake<span class="w"> </span>..
+make<span class="w"> </span>-j<span class="k">$(</span>nproc<span class="k">)</span><span class="w">   </span><span class="c1"># nproc = 4/8/..  (Number of Parallel jobs)</span>
 </pre></div>
 </div>
 </div>
-<div class="section" id="input-data-layout-for-operations-to-be-offloaded-to-bnns-execution">
-<h2>Input data layout for operations to be offloaded to BNNS execution<a class="headerlink" href="#input-data-layout-for-operations-to-be-offloaded-to-bnns-execution" title="Permalink to this headline">¶</a></h2>
-<p>BNNS kernels support only planar format of input data. The partitioner will require to have NCHW input
-layout for conv2d input.</p>
-<p>To use BNNS integration for models with interleave input layout, they should be converted before
-passing of module to <cite>partition_for_bnns</cite>. The layout conversion will happen only for explicitly
-enumerated types of ops. It might happen that depending on topology there might be regular data reorder
-around conv2d to interleave and planar layout. This will be reflected in performance penalties and affect
-execution time. It is recommended to analyze the whole topology and extend below list to convert all
-intermediate tensors to NCHW data layout.</p>
-<p>Example of input layouts change:</p>
-<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># For models with NHWC input layout</span>
-<span class="k">with</span> <span class="n">tvm</span><span class="o">.</span><span class="n">transform</span><span class="o">.</span><span class="n">PassContext</span><span class="p">(</span><span class="n">opt_level</span><span class="o">=</span><span class="mi">3</span><span class="p">):</span>
-    <span class="n">mod</span> <span class="o">=</span> <span class="n">relay</span><span class="o">.</span><span class="n">transform</span><span class="o">.</span><span class="n">InferType</span><span class="p">()(</span><span class="n">mod</span><span class="p">)</span>
-    <span class="n">mod</span> <span class="o">=</span> <span class="n">relay</span><span class="o">.</span><span class="n">transform</span><span class="o">.</span><span class="n">ConvertLayout</span><span class="p">({</span><span class="s2">&quot;nn.conv2d&quot;</span><span class="p">:</span> <span class="p">[</span><span class="s2">&quot;NCHW&quot;</span><span class="p">,</span> <span class="s2">&quot;default&quot;</span><span class="p">],</span>
-                                        <span class="s2">&quot;nn.bias_add&quot;</span><span class="p">:</span> <span class="p">[</span><span class="s2">&quot;NCHW&quot;</span><span class="p">,</span> <span class="s2">&quot;default&quot;</span><span class="p">],</span>
-                                        <span class="s2">&quot;nn.relu&quot;</span><span class="p">:</span> <span class="p">[</span><span class="s2">&quot;NCHW&quot;</span><span class="p">]})(</span><span class="n">mod</span><span class="p">)</span>
+<div class="section" id="compiling-a-model-using-tvmc-command-line">
+<h2>4. Compiling a model using TVMC command line<a class="headerlink" href="#compiling-a-model-using-tvmc-command-line" title="Permalink to this headline">¶</a></h2>
+<p>Models can be compiled and run for the mrvl target using TVMC,
+which optimizes them for performance.</p>
+<p>Refer to the following TVMC documentation for generic tvmc options:
+<a class="reference external" href="https://tvm.apache.org/docs/tutorial/tvmc_command_line_driver.html">https://tvm.apache.org/docs/tutorial/tvmc_command_line_driver.html</a></p>
+<p>Additional mrvl-specific options may be added as attributes if
+necessary. Advanced usage is described later in this document.</p>
+</div>
+<div class="section" id="tvmc-compilation-flow-for-a-model">
+<h2>4.1 TVMC Compilation Flow for a model<a class="headerlink" href="#tvmc-compilation-flow-for-a-model" title="Permalink to this headline">¶</a></h2>
+<p>Refer to the following TVM documentation for the compilation flow:
+<a class="reference external" href="https://tvm.apache.org/docs/arch/index.html#example-compilation-flow">https://tvm.apache.org/docs/arch/index.html#example-compilation-flow</a></p>
+</div>
+<div class="section" id="tvmc-command-line-option-s-syntax-for-mrvl-target">
+<h2>4.2. TVMC - Command line option(s): Syntax for mrvl target<a class="headerlink" href="#tvmc-command-line-option-s-syntax-for-mrvl-target" title="Permalink to this headline">¶</a></h2>
+<p>Compiling an ONNX model with tvmc for the mrvl target.</p>
+<p><strong>Syntax:</strong></p>
+<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">python3</span> <span class="o">-</span><span class="n">m</span> <span class="n">tvm</span><span class="o">.</span><span class="n">driver</span><span class="o">.</span><span class="n">tvmc</span> <span class="nb">compile</span> <span class="o">--</span><span class="n">target</span><span class="o">=</span><span class="s2">&quot;mrvl, llvm&quot;</span>
+    <span class="o">--</span><span class="n">target</span><span class="o">-</span><span class="n">llvm</span><span class="o">-&lt;</span><span class="n">options</span><span class="o">&gt;</span>
+    <span class="o">--</span><span class="n">target</span><span class="o">-</span><span class="n">mrvl</span><span class="o">-&lt;</span><span class="n">options</span><span class="o">&gt;</span>
+    <span class="o">--&lt;</span><span class="n">tvm</span><span class="o">-</span><span class="n">generic</span><span class="o">-</span><span class="n">options</span><span class="o">&gt;</span>
+    <span class="n">model_file</span><span class="o">.</span><span class="n">onnx</span>
 </pre></div>
 </div>
+<p>The following is an example TVMC compile command for an ARMv9 core and
+integrated MLIP cn10ka processor, using only 4 tiles in the block.</p>
+<p><strong>Example:</strong></p>
+<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">python3</span> <span class="o">-</span><span class="n">m</span> <span class="n">tvm</span><span class="o">.</span><span class="n">driver</span><span class="o">.</span><span class="n">tvmc</span> <span class="nb">compile</span> <span class="o">--</span><span class="n">target</span><span class="o">=</span><span class="s2">&quot;mrvl, llvm&quot;</span> \
+    <span class="o">--</span><span class="n">target</span><span class="o">-</span><span class="n">llvm</span><span class="o">-</span><span class="n">mtriple</span><span class="o">=</span><span class="n">aarch64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span> <span class="o">--</span><span class="n">target</span><span class="o">-</span><span class="n">llvm</span><span class="o">-</span><span class="n">mcpu</span><span class="o"> [...]
+    <span class="o">--</span><span class="n">target</span><span class="o">-</span><span class="n">mrvl</span><span class="o">-</span><span class="n">num_tiles</span><span class="o">=</span><span class="mi">4</span> \
+    <span class="o">--</span><span class="n">cross</span><span class="o">-</span><span class="n">compiler</span> <span class="n">aarch64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">-</span><span class="n">gcc</span> \
+    <span class="o">--</span><span class="n">output</span> <span class="n">model</span><span class="o">.</span><span class="n">tar</span> \
+    <span class="n">mnist</span><span class="o">-</span><span class="mf">12.</span><span class="n">onnx</span>
+</pre></div>
 </div>
-<div class="section" id="example-build-and-deploy-mobilenet-v2-1-0-with-bnns">
-<h2>Example: Build and Deploy Mobilenet v2 1.0 with BNNS<a class="headerlink" href="#example-build-and-deploy-mobilenet-v2-1-0-with-bnns" title="Permalink to this headline">¶</a></h2>
-<p>Create a Relay graph from a MXNet Mobilenet v2 1.0 model.</p>
-<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">tvm</span>
-<span class="kn">from</span> <span class="nn">tvm</span> <span class="kn">import</span> <span class="n">relay</span>
-<span class="kn">import</span> <span class="nn">mxnet</span>
-<span class="kn">from</span> <span class="nn">mxnet.gluon.model_zoo.vision</span> <span class="kn">import</span> <span class="n">get_model</span>
-
-<span class="n">dtype</span> <span class="o">=</span> <span class="s2">&quot;float32&quot;</span>
-<span class="n">input_shape</span> <span class="o">=</span> <span class="p">(</span><span class="mi">1</span><span class="p">,</span> <span class="mi">3</span><span class="p">,</span> <span class="mi">224</span><span class="p">,</span> <span class="mi">224</span><span class="p">)</span>
-<span class="n">block</span> <span class="o">=</span> <span class="n">get_model</span><span class="p">(</span><span class="s1">&#39;mobilenetv2_1.0&#39;</span><span class="p">,</span> <span class="n">pretrained</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
-<span class="n">module</span><span class="p">,</span> <span class="n">params</span> <span class="o">=</span> <span class="n">relay</span><span class="o">.</span><span class="n">frontend</span><span class="o">.</span><span class="n">from_mxnet</span><span class="p">(</span><span class="n">block</span><span class="p">,</span> <span class="n">shape</span><span class="o">=</span><span class="p">{</span><span class="s1">&#39;data&#39;</span><span class="p">:</span> <span class="n">input_shape [...]
+</div>
+<div class="section" id="tvmc-compiler-mrvl-specific-command-line-options">
+<h2>4.3. TVMC Compiler: mrvl specific Command Line Options<a class="headerlink" href="#tvmc-compiler-mrvl-specific-command-line-options" title="Permalink to this headline">¶</a></h2>
+<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="o">--</span><span class="n">target</span><span class="o">-</span><span class="n">mrvl</span><span class="o">-</span><span class="n">mcpu</span>
+<span class="o">--</span><span class="n">target</span><span class="o">-</span><span class="n">mrvl</span><span class="o">-</span><span class="n">num_tiles</span>
+<span class="o">--</span><span class="n">target</span><span class="o">-</span><span class="n">mrvl</span><span class="o">-</span><span class="n">mattr</span>
 </pre></div>
 </div>
-<p>Markup the parts of graphs to be offloaded to BNNS primitives. All ops which are supported by the BNNS
-integration will be handled by BNNS invocations, the rest of the ops will go through the
-regular TVM llvm compilation and code generation.</p>
-<p>After that you need to compile new module with target corresponding to required Apple platform</p>
-<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">tvm.relay.op.contrib.bnns</span> <span class="kn">import</span> <span class="n">partition_for_bnns</span>
-
-<span class="c1"># target for macOS Big Sur 11.1:</span>
-<span class="n">target</span> <span class="o">=</span> <span class="s2">&quot;llvm -mtriple=x86_64-apple-darwin20.2.0&quot;</span>
-
-<span class="n">model</span> <span class="o">=</span> <span class="n">partition_for_bnns</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">params</span><span class="o">=</span><span class="n">params</span><span class="p">)</span>  <span class="c1"># to markup operations to be offloaded to BNNS</span>
-<span class="k">with</span> <span class="n">tvm</span><span class="o">.</span><span class="n">transform</span><span class="o">.</span><span class="n">PassContext</span><span class="p">(</span><span class="n">opt_level</span><span class="o">=</span><span class="mi">3</span><span class="p">):</span>
-    <span class="n">lib</span> <span class="o">=</span> <span class="n">relay</span><span class="o">.</span><span class="n">build</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">target</span><span class="o">=</span><span class="n">target</span><span class="p">,</span> <span class="n">params</span><span class="o">=</span><span class="n">params</span><span class="p">)</span>
+<p><strong>Description of mrvl options</strong></p>
+<ul>
+<li><dl class="simple">
+<dt>mcpu:</dt><dd><p>The CPU class of the Marvell(R) ML Inference Processor;
+possible values = {cn10ka, cnf10kb}; defaults to cn10ka.</p>
+</dd>
+</dl>
+</li>
+<li><dl class="simple">
+<dt>num_tiles:</dt><dd><p>Maximum number of tiles that may be used; possible values = {1, 2, 4, 8}; defaults to 8.</p>
+</dd>
+</dl>
+</li>
+<li><dl>
+<dt>mattr:</dt><dd><p>Attributes for mrvl; possible values = {quantize, wb_pin_ocm}</p>
+<p>mattr specifies the data type, code generation options and optimizations.</p>
+<p><em>The supported attributes are:</em></p>
+<p><strong>1. quantize</strong></p>
+<p>Specifies the data type. Possible values = {fp16, int8}.
+Default is fp16; int8 is a work in progress and full support will be added in a future PR.</p>
+<p><strong>2. wb_pin_ocm</strong></p>
+<p>Optimizes runtime by preloading a model’s weights and bias into
+the on-chip memory. Possible values = {0, 1}. Default is 0 (no preload).
+An illustrative command combining these options is shown after this list.</p>
+</dd>
+</dl>
+</li>
+</ul>
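+<p>For illustration only, the options above can be combined with the section 4.2 command as
+sketched below; the flag names are those listed above, but the value syntax assumed for
+<code class="docutils literal notranslate"><span class="pre">--target-mrvl-mattr</span></code> may differ:</p>
+<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>python3 -m tvm.driver.tvmc compile --target=&quot;mrvl, llvm&quot; \
+    --target-llvm-mtriple=aarch64-linux-gnu --target-llvm-mcpu=neoverse-n2 \
+    --target-mrvl-mcpu=cn10ka --target-mrvl-num_tiles=4 \
+    --target-mrvl-mattr=&quot;wb_pin_ocm&quot; \
+    --cross-compiler aarch64-linux-gnu-gcc \
+    --output model.tar mnist-12.onnx
+</pre></div>
+</div>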
+</div>
+<div class="section" id="compilation-generating-model-partitions">
+<h2>5. Compilation - Generating model partitions<a class="headerlink" href="#compilation-generating-model-partitions" title="Permalink to this headline">¶</a></h2>
+<p>In the TVMC mrvl flow, the model is partitioned into Marvell and LLVM regions.
+Building each partitioned Marvell subgraph generates serialized nodes.json and
+const.json files. The partitioned nodes.json is the representation of the model graph in a
+form suitable for the Marvell mmlc compiler, which is distributed separately via CDK.</p>
+<p><strong>Model Partition</strong></p>
+<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>python3<span class="w"> </span>-m<span class="w"> </span>tvm.driver.tvmc<span class="w"> </span>compile<span class="w"> </span>--target<span class="o">=</span><span class="s2">&quot;mrvl, llvm \</span>
+<span class="s2">-mtriple=aarch64-linux-gnu -mcpu=neoverse-n2&quot;</span><span class="w"> </span><span class="se">\</span>
+--cross-compiler<span class="w"> </span>aarch64-linux-gnu-gcc<span class="w"> </span><span class="se">\</span>
+--target-mrvl-num_tiles<span class="o">=</span><span class="m">4</span><span class="w"> </span>--output<span class="w"> </span>model.tar<span class="w"> </span>model.onnx
 </pre></div>
 </div>
-<p>Export the module.</p>
-<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">lib</span><span class="o">.</span><span class="n">export_library</span><span class="p">(</span><span class="s1">&#39;compiled.dylib&#39;</span><span class="p">)</span>
+</div>
+<div class="section" id="compiling-a-model-using-python-apis">
+<h2>6. Compiling a model using Python APIs<a class="headerlink" href="#compiling-a-model-using-python-apis" title="Permalink to this headline">¶</a></h2>
+<p>In addition to using TVMC, models can also be compiled and run using the
+TVM Python API. Below is an example of compiling the MNIST model. Support
+for running the model will be part of the next PR by mrvl.</p>
+<p><strong>Download the MNIST model from the web</strong></p>
+<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span><span class="nb">cd</span><span class="w"> </span><span class="nv">$HOME</span>
+wget<span class="w"> </span>https://github.com/onnx/models/raw/main/validated/vision/classification/mnist/model/mnist-12.onnx
 </pre></div>
 </div>
-<p>Load module and run inference on the target machine with TVM  built with <code class="docutils literal notranslate"><span class="pre">USE_BNNS</span></code> enabled</p>
-<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">tvm</span>
+<p><strong>Import TVM and other dependent modules</strong></p>
+<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">tvm</span><span class="o">,</span> <span class="nn">onnx</span><span class="o">,</span> <span class="nn">os</span>
 <span class="kn">import</span> <span class="nn">numpy</span> <span class="k">as</span> <span class="nn">np</span>
-<span class="kn">from</span> <span class="nn">tvm.contrib</span> <span class="kn">import</span> <span class="n">graph_executor</span>
+<span class="kn">import</span> <span class="nn">tvm.relay</span> <span class="k">as</span> <span class="nn">relay</span>
+<span class="kn">from</span> <span class="nn">tvm.relay.op.contrib.mrvl</span> <span class="kn">import</span> <span class="n">partition_for_mrvl</span>
+<span class="kn">from</span> <span class="nn">tvm.relay.build_module</span> <span class="kn">import</span> <span class="n">build</span>
+<span class="kn">from</span> <span class="nn">keras.datasets</span> <span class="kn">import</span> <span class="n">mnist</span>
+</pre></div>
+</div>
+<p><strong>Load the ONNX model file</strong></p>
+<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">onnx_model</span> <span class="o">=</span> <span class="n">onnx</span><span class="o">.</span><span class="n">load</span><span class="p">(</span><span class="s2">&quot;mnist-12.onnx&quot;</span><span class="p">)</span>
+</pre></div>
+</div>
+<p><strong>Create a Relay graph from the MNIST model</strong></p>
+<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">shape_dict</span> <span class="o">=</span> <span class="p">{</span><span class="s1">&#39;Input3&#39;</span> <span class="p">:</span> <span class="p">(</span><span class="mi">1</span><span class="p">,</span><span class="mi">1</span><span class="p">,</span><span class="mi">28</span><span class="p">,</span><span class="mi">28</span><span class="p">)}</span>
+<span class="n">mod</span><span class="p">,</span> <span class="n">params</span> <span class="o">=</span> <span class="n">relay</span><span class="o">.</span><span class="n">frontend</span><span class="o">.</span><span class="n">from_onnx</span><span class="p">(</span><span class="n">onnx_model</span><span class="p">,</span> <span class="n">shape_dict</span><span class="p">)</span>
+</pre></div>
+</div>
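+<p>The <code class="docutils literal notranslate"><span class="pre">keras.datasets</span></code> import above provides sample MNIST digits. As a
+minimal sketch (input preparation only; running the compiled model on Marvell hardware is
+not covered in this guide), one test image can be reshaped to match the
+<code class="docutils literal notranslate"><span class="pre">Input3</span></code> shape declared above:</p>
+<div class="highlight-python notranslate"><div class="highlight"><pre><span></span># Prepare one MNIST test digit as a (1, 1, 28, 28) float32 input tensor.
+(_, _), (x_test, y_test) = mnist.load_data()
+digit = x_test[0].astype(&quot;float32&quot;) / 255.0  # normalize pixels to [0, 1]
+input_data = digit.reshape(1, 1, 28, 28)  # NCHW layout expected by Input3
+</pre></div>
+</div>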
+<p><strong>Define the option dictionary and partition the model</strong></p>
+<p>Annotate and partition the graph for mrvl. All operations supported
+by mrvl will be marked and offloaded to the mrvl hardware accelerator. The remaining
+operations will go through the regular LLVM compilation and code generation for ARM.</p>
+<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">tvm_target</span> <span class="o">=</span> <span class="s2">&quot;llvm&quot;</span>
 
-<span class="n">dev</span> <span class="o">=</span> <span class="n">tvm</span><span class="o">.</span><span class="n">cpu</span><span class="p">(</span><span class="mi">0</span><span class="p">)</span>
-<span class="n">loaded_lib</span> <span class="o">=</span> <span class="n">tvm</span><span class="o">.</span><span class="n">runtime</span><span class="o">.</span><span class="n">load_module</span><span class="p">(</span><span class="s1">&#39;compiled.dylib&#39;</span><span class="p">)</span>
-<span class="n">gen_module</span> <span class="o">=</span> <span class="n">tvm</span><span class="o">.</span><span class="n">contrib</span><span class="o">.</span><span class="n">graph_executor</span><span class="o">.</span><span class="n">GraphModule</span><span class="p">(</span><span class="n">loaded_lib</span><span class="p">[</span><span class="s1">&#39;default&#39;</span><span class="p">](</span><span class="n">dev</span><span class="p">))</span>
+<span class="n">option_dict</span> <span class="o">=</span> <span class="p">{</span><span class="s1">&#39;num_tiles&#39;</span><span class="p">:</span> <span class="mi">4</span><span class="p">}</span>
 
-<span class="n">dtype</span> <span class="o">=</span> <span class="s2">&quot;float32&quot;</span>
-<span class="n">input_shape</span> <span class="o">=</span> <span class="p">(</span><span class="mi">1</span><span class="p">,</span> <span class="mi">3</span><span class="p">,</span> <span class="mi">224</span><span class="p">,</span> <span class="mi">224</span><span class="p">)</span>
-<span class="n">input_data</span> <span class="o">=</span> <span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">uniform</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="n">input_shape</span><span class="p">)</span><span class="o">.</span><span class="n">astype</span><span class="p">(</span><span class="n">dtype</span><span class="p [...]
-<span class="n">gen_module</span><span class="o">.</span><span class="n">run</span><span class="p">(</span><span class="n">data</span><span class="o">=</span><span class="n">input_data</span><span class="p">)</span>
+<span class="n">mod</span> <span class="o">=</span> <span class="n">partition_for_mrvl</span><span class="p">(</span><span class="n">mod</span><span class="p">,</span> <span class="n">params</span><span class="p">,</span> <span class="o">**</span><span class="n">option_dict</span><span class="p">)</span>
 </pre></div>
 </div>
+<p><strong>Build the Relay Graph</strong></p>
+<p>Build the Relay graph, using the new module returned by partition_for_mrvl.
+The target must always be an LLVM (ARM) target. <code class="docutils literal notranslate"><span class="pre">partition_for_mrvl</span></code> will
+pass the options from the dictionary into the config parameters needed by the
+compiler backend, so there is no need to modify it; just pass it along
+to the PassContext so the values can be read during compilation.</p>
+<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">tvm</span><span class="o">.</span><span class="n">transform</span><span class="o">.</span><span class="n">PassContext</span><span class="p">(</span><span class="n">opt_level</span><span class="o">=</span><span class="mi">3</span><span class="p">,</span> <span class="n">config</span><span class="o">=</span><span class="p">{</span><span class="s2">&quot;relay.ext.m [...]
+        <span class="n">model_lib</span> <span class="o">=</span> <span class="n">relay</span><span class="o">.</span><span class="n">build</span><span class="p">(</span><span class="n">mod</span><span class="p">,</span> <span class="n">tvm_target</span><span class="p">,</span> <span class="n">params</span><span class="o">=</span><span class="n">params</span><span class="p">)</span>
+</pre></div>
 </div>
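+<p>Once built, the module can be saved for later use with TVM’s standard
+<code class="docutils literal notranslate"><span class="pre">export_library</span></code> API. This is a generic TVM step rather than something
+specific to mrvl, and running the saved model on Marvell hardware is deferred to a future PR
+as noted above; the file name below is only an example.</p>
+<div class="highlight-python notranslate"><div class="highlight"><pre><span></span># Save the compiled artifacts to a shared library for later deployment.
+model_lib.export_library(&quot;model_mrvl.so&quot;)
+</pre></div>
+</div>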
-<div class="section" id="operator-support">
-<h2>Operator support<a class="headerlink" href="#operator-support" title="Permalink to this headline">¶</a></h2>
-<table class="docutils align-default">
-<colgroup>
-<col style="width: 24%" />
-<col style="width: 76%" />
-</colgroup>
-<thead>
-<tr class="row-odd"><th class="head"><p>Relay Node</p></th>
-<th class="head"><p>Remarks</p></th>
-</tr>
-</thead>
-<tbody>
-<tr class="row-even"><td><p>nn.conv2d</p></td>
-<td></td>
-</tr>
-<tr class="row-odd"><td><p>nn.batch_norm</p></td>
-<td><p>Supported by BNNS integration only in nn.conv2d-batch_norm pattern</p></td>
-</tr>
-<tr class="row-even"><td><p>nn.dense</p></td>
-<td></td>
-</tr>
-<tr class="row-odd"><td><p>nn.batch_matmul</p></td>
-<td></td>
-</tr>
-<tr class="row-even"><td><p>nn.bias_add</p></td>
-<td><p>Supported by BNNS integration only as a bias part of nn.conv2d or nn.dense
-fusion</p></td>
-</tr>
-<tr class="row-odd"><td><p>add</p></td>
-<td><p>Supported by BNNS integration only as a bias part of nn.conv2d or nn.dense
-fusion</p></td>
-</tr>
-<tr class="row-even"><td><p>nn.relu</p></td>
-<td><p>Supported by BNNS integration only as a part of nn.conv2d or nn.dense fusion</p></td>
-</tr>
-<tr class="row-odd"><td><p>nn.gelu</p></td>
-<td><p>Supported by BNNS integration only as a part of nn.conv2d or nn.dense fusion</p></td>
-</tr>
-</tbody>
-</table>
 </div>
 </div>
 
@@ -566,7 +591,7 @@ fusion</p></td>
         <a href="../deploy_models/index.html" class="btn btn-neutral float-right" title="Deploy Deep Learning Models" accesskey="n" rel="next">Next <span class="fa fa-arrow-circle-right"></span></a>
       
       
-        <a href="vitis_ai.html" class="btn btn-neutral float-left" title="Vitis AI Integration" accesskey="p" rel="prev"><span class="fa fa-arrow-circle-left"></span> Previous</a>
+        <a href="bnns.html" class="btn btn-neutral float-left" title="Relay BNNS Integration" accesskey="p" rel="prev"><span class="fa fa-arrow-circle-left"></span> Previous</a>
       
     </div>
 
diff --git a/docs/how_to/deploy/tensorrt.html b/docs/how_to/deploy/tensorrt.html
index 1c5565f26b..b3d95212a0 100644
--- a/docs/how_to/deploy/tensorrt.html
+++ b/docs/how_to/deploy/tensorrt.html
@@ -270,6 +270,7 @@
 <li class="toctree-l4 current"><a class="current reference internal" href="#">Relay TensorRT Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="vitis_ai.html">Vitis AI Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="bnns.html">Relay BNNS Integration</a></li>
+<li class="toctree-l4"><a class="reference internal" href="mrvl.html">Marvell Machine Learning Integration</a></li>
 </ul>
 </li>
 <li class="toctree-l3"><a class="reference internal" href="index.html#additional-deployment-how-tos">Additional Deployment How-Tos</a></li>
diff --git a/docs/how_to/deploy/vitis_ai.html b/docs/how_to/deploy/vitis_ai.html
index e974937bf0..d29fefef84 100644
--- a/docs/how_to/deploy/vitis_ai.html
+++ b/docs/how_to/deploy/vitis_ai.html
@@ -270,6 +270,7 @@
 <li class="toctree-l4"><a class="reference internal" href="tensorrt.html">Relay TensorRT Integration</a></li>
 <li class="toctree-l4 current"><a class="current reference internal" href="#">Vitis AI Integration</a></li>
 <li class="toctree-l4"><a class="reference internal" href="bnns.html">Relay BNNS Integration</a></li>
+<li class="toctree-l4"><a class="reference internal" href="mrvl.html">Marvell Machine Learning Integration</a></li>
 </ul>
 </li>
 <li class="toctree-l3"><a class="reference internal" href="index.html#additional-deployment-how-tos">Additional Deployment How-Tos</a></li>
diff --git a/docs/how_to/deploy_models/deploy_model_on_adreno.html b/docs/how_to/deploy_models/deploy_model_on_adreno.html
index 687d317d34..62ad77c38f 100644
--- a/docs/how_to/deploy_models/deploy_model_on_adreno.html
+++ b/docs/how_to/deploy_models/deploy_model_on_adreno.html
@@ -850,10 +850,10 @@ Top5 predictions:
 Evaluate inference time cost...
 Execution time summary:
  mean (ms)   median (ms)    max (ms)     min (ms)     std (ms)
- 3997.9147    3995.0318    4020.8693    3993.1790      7.8819
+ 3993.9828    3989.9656    4021.5015    3988.6683      9.4269
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  17.688 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  16.408 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-deploy-models-deploy-model-on-adreno-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/2387d8448da213eb625e6b3d916327d4/deploy_model_on_adreno.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">deploy_model_on_adreno.py</span></code></a></p>
diff --git a/docs/how_to/deploy_models/deploy_model_on_adreno_tvmc.html b/docs/how_to/deploy_models/deploy_model_on_adreno_tvmc.html
index 8c3b1cc62b..17bde42613 100644
--- a/docs/how_to/deploy_models/deploy_model_on_adreno_tvmc.html
+++ b/docs/how_to/deploy_models/deploy_model_on_adreno_tvmc.html
@@ -458,25 +458,28 @@ to run this tutorial with a real device over rpc.</p>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/resnet/resnet50_weights_tf_dim_ordering_tf_kernels.h5
 
      8192/102967424 [..............................] - ETA: 0s
-  4653056/102967424 [&gt;.............................] - ETA: 1s
+  1572864/102967424 [..............................] - ETA: 3s
+  2785280/102967424 [..............................] - ETA: 4s
   8380416/102967424 [=&gt;............................] - ETA: 2s
- 15024128/102967424 [===&gt;..........................] - ETA: 1s
- 16769024/102967424 [===&gt;..........................] - ETA: 1s
+ 16769024/102967424 [===&gt;..........................] - ETA: 2s
+ 16908288/102967424 [===&gt;..........................] - ETA: 2s
  23412736/102967424 [=====&gt;........................] - ETA: 1s
  25157632/102967424 [======&gt;.......................] - ETA: 1s
  33546240/102967424 [========&gt;.....................] - ETA: 1s
- 41934848/102967424 [===========&gt;..................] - ETA: 0s
- 48578560/102967424 [=============&gt;................] - ETA: 0s
- 50323456/102967424 [=============&gt;................] - ETA: 0s
- 52518912/102967424 [==============&gt;...............] - ETA: 0s
- 58728448/102967424 [================&gt;.............] - ETA: 0s
+ 41934848/102967424 [===========&gt;..................] - ETA: 1s
+ 50323456/102967424 [=============&gt;................] - ETA: 1s
+ 58712064/102967424 [================&gt;.............] - ETA: 0s
  67100672/102967424 [==================&gt;...........] - ETA: 0s
+ 69296128/102967424 [===================&gt;..........] - ETA: 0s
+ 72753152/102967424 [====================&gt;.........] - ETA: 0s
+ 73744384/102967424 [====================&gt;.........] - ETA: 0s
  75489280/102967424 [====================&gt;.........] - ETA: 0s
  83877888/102967424 [=======================&gt;......] - ETA: 0s
- 92266496/102967424 [=========================&gt;....] - ETA: 0s
+ 90521600/102967424 [=========================&gt;....] - ETA: 0s
+ 92405760/102967424 [=========================&gt;....] - ETA: 0s
+100368384/102967424 [============================&gt;.] - ETA: 0s
 100646912/102967424 [============================&gt;.] - ETA: 0s
-102850560/102967424 [============================&gt;.] - ETA: 0s
-102967424/102967424 [==============================] - 1s 0us/step
+102967424/102967424 [==============================] - 2s 0us/step
 </pre></div>
 </div>
 </div>
diff --git a/docs/how_to/deploy_models/deploy_model_on_android.html b/docs/how_to/deploy_models/deploy_model_on_android.html
index 616b8d111b..920f8260f3 100644
--- a/docs/how_to/deploy_models/deploy_model_on_android.html
+++ b/docs/how_to/deploy_models/deploy_model_on_android.html
@@ -682,7 +682,7 @@ to the remote android device.</p>
 Evaluate inference time cost...
 Execution time summary:
  mean (ms)   median (ms)    max (ms)     min (ms)     std (ms)
-  14.6827      14.6408      14.8761      14.5288       0.1319
+  14.2411      14.0100      14.7648      13.7551       0.3886
 </pre></div>
 </div>
 </div>
diff --git a/docs/how_to/deploy_models/deploy_object_detection_pytorch.html b/docs/how_to/deploy_models/deploy_object_detection_pytorch.html
index 4cfd3a214e..decc5d2bbc 100644
--- a/docs/how_to/deploy_models/deploy_object_detection_pytorch.html
+++ b/docs/how_to/deploy_models/deploy_object_detection_pytorch.html
@@ -474,33 +474,39 @@ be unstable.</p>
 Downloading: &quot;https://download.pytorch.org/models/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth&quot; to /workspace/.cache/torch/hub/checkpoints/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth
 
   0%|          | 0.00/170M [00:00&lt;?, ?B/s]
-  4%|4         | 7.47M/170M [00:00&lt;00:02, 78.3MB/s]
-  9%|8         | 14.9M/170M [00:00&lt;00:05, 31.6MB/s]
- 11%|#1        | 19.3M/170M [00:00&lt;00:05, 28.1MB/s]
- 14%|#4        | 24.0M/170M [00:00&lt;00:04, 31.8MB/s]
- 19%|#8        | 32.0M/170M [00:00&lt;00:04, 35.6MB/s]
- 24%|##3       | 40.0M/170M [00:01&lt;00:03, 42.0MB/s]
- 28%|##8       | 48.0M/170M [00:01&lt;00:03, 39.5MB/s]
- 33%|###2      | 56.0M/170M [00:01&lt;00:02, 42.7MB/s]
- 37%|###6      | 62.3M/170M [00:01&lt;00:02, 45.8MB/s]
- 40%|###9      | 67.3M/170M [00:01&lt;00:02, 47.2MB/s]
- 42%|####2     | 72.1M/170M [00:01&lt;00:02, 43.8MB/s]
- 47%|####7     | 80.0M/170M [00:02&lt;00:02, 45.9MB/s]
- 52%|#####1    | 88.0M/170M [00:02&lt;00:01, 44.8MB/s]
- 56%|#####5    | 94.3M/170M [00:02&lt;00:01, 46.9MB/s]
- 58%|#####8    | 98.9M/170M [00:02&lt;00:01, 39.8MB/s]
- 61%|######1   | 104M/170M [00:02&lt;00:01, 39.8MB/s]
- 66%|######5   | 112M/170M [00:02&lt;00:01, 48.0MB/s]
- 71%|#######   | 120M/170M [00:02&lt;00:00, 52.8MB/s]
- 74%|#######4  | 126M/170M [00:03&lt;00:00, 55.8MB/s]
- 78%|#######7  | 132M/170M [00:03&lt;00:00, 51.7MB/s]
- 81%|########  | 137M/170M [00:03&lt;00:00, 41.4MB/s]
- 85%|########4 | 144M/170M [00:03&lt;00:00, 45.5MB/s]
- 88%|########8 | 150M/170M [00:03&lt;00:00, 47.6MB/s]
- 91%|#########1| 155M/170M [00:03&lt;00:00, 37.7MB/s]
- 94%|#########4| 160M/170M [00:03&lt;00:00, 37.2MB/s]
- 98%|#########7| 166M/170M [00:04&lt;00:00, 33.9MB/s]
-100%|##########| 170M/170M [00:04&lt;00:00, 41.5MB/s]
+  4%|3         | 6.30M/170M [00:00&lt;00:04, 41.1MB/s]
+  6%|6         | 10.2M/170M [00:00&lt;00:05, 32.4MB/s]
+  8%|8         | 14.3M/170M [00:00&lt;00:07, 21.0MB/s]
+ 10%|9         | 16.6M/170M [00:00&lt;00:07, 20.5MB/s]
+ 14%|#4        | 24.0M/170M [00:00&lt;00:05, 27.7MB/s]
+ 19%|#8        | 32.0M/170M [00:01&lt;00:04, 34.6MB/s]
+ 24%|##3       | 40.0M/170M [00:01&lt;00:03, 42.2MB/s]
+ 27%|##7       | 46.3M/170M [00:01&lt;00:03, 40.2MB/s]
+ 30%|##9       | 50.3M/170M [00:01&lt;00:03, 40.2MB/s]
+ 33%|###2      | 56.0M/170M [00:01&lt;00:02, 40.9MB/s]
+ 37%|###6      | 62.3M/170M [00:01&lt;00:02, 44.6MB/s]
+ 39%|###9      | 66.7M/170M [00:01&lt;00:02, 40.1MB/s]
+ 42%|####2     | 72.0M/170M [00:02&lt;00:02, 42.7MB/s]
+ 46%|####6     | 78.3M/170M [00:02&lt;00:02, 41.2MB/s]
+ 48%|####8     | 82.3M/170M [00:02&lt;00:02, 34.5MB/s]
+ 51%|#####     | 86.3M/170M [00:02&lt;00:02, 35.2MB/s]
+ 53%|#####2    | 89.8M/170M [00:02&lt;00:02, 32.6MB/s]
+ 56%|#####5    | 94.3M/170M [00:02&lt;00:02, 29.5MB/s]
+ 57%|#####7    | 97.2M/170M [00:03&lt;00:02, 27.0MB/s]
+ 60%|######    | 102M/170M [00:03&lt;00:02, 28.2MB/s]
+ 62%|######1   | 105M/170M [00:03&lt;00:02, 22.9MB/s]
+ 66%|######5   | 112M/170M [00:03&lt;00:01, 31.9MB/s]
+ 71%|#######   | 120M/170M [00:03&lt;00:01, 40.2MB/s]
+ 74%|#######4  | 126M/170M [00:03&lt;00:01, 43.6MB/s]
+ 77%|#######7  | 131M/170M [00:03&lt;00:00, 41.5MB/s]
+ 80%|########  | 136M/170M [00:04&lt;00:01, 34.4MB/s]
+ 85%|########4 | 144M/170M [00:04&lt;00:00, 42.0MB/s]
+ 88%|########8 | 150M/170M [00:04&lt;00:00, 46.7MB/s]
+ 91%|#########1| 155M/170M [00:04&lt;00:00, 43.2MB/s]
+ 94%|#########3| 160M/170M [00:04&lt;00:00, 34.9MB/s]
+ 96%|#########6| 163M/170M [00:04&lt;00:00, 33.6MB/s]
+ 98%|#########8| 167M/170M [00:05&lt;00:00, 28.7MB/s]
+100%|##########| 170M/170M [00:05&lt;00:00, 34.8MB/s]
 /venv/apache-tvm-py3.8/lib/python3.8/site-packages/torch/nn/functional.py:3912: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
   (torch.floor((input.size(i + 2).float() * torch.tensor(scale_factors[i], dtype=torch.float32)).float()))
 /venv/apache-tvm-py3.8/lib/python3.8/site-packages/torchvision/ops/boxes.py:157: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
@@ -591,7 +597,7 @@ torchvision rcnn models.</p>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Get 9 valid boxes
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 3 minutes  30.916 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 3 minutes  20.110 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-deploy-models-deploy-object-detection-pytorch-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/7795da4b258c8feff986668b95ef57ad/deploy_object_detection_pytorch.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">deploy_object_detection_pytorch.py</span></code></a></p>
diff --git a/docs/how_to/deploy_models/deploy_prequantized.html b/docs/how_to/deploy_models/deploy_prequantized.html
index f3fba57fd2..90c9fd2c13 100644
--- a/docs/how_to/deploy_models/deploy_prequantized.html
+++ b/docs/how_to/deploy_models/deploy_prequantized.html
@@ -515,9 +515,9 @@ training. Other models require a full post training calibration.</p>
 Downloading: &quot;https://download.pytorch.org/models/mobilenet_v2-b0353104.pth&quot; to /workspace/.cache/torch/hub/checkpoints/mobilenet_v2-b0353104.pth
 
   0%|          | 0.00/13.6M [00:00&lt;?, ?B/s]
- 59%|#####8    | 7.99M/13.6M [00:00&lt;00:00, 29.7MB/s]
- 90%|########9 | 12.2M/13.6M [00:00&lt;00:00, 33.5MB/s]
-100%|##########| 13.6M/13.6M [00:00&lt;00:00, 35.9MB/s]
+ 47%|####6     | 6.30M/13.6M [00:00&lt;00:00, 39.8MB/s]
+ 75%|#######4  | 10.1M/13.6M [00:00&lt;00:00, 38.4MB/s]
+100%|##########| 13.6M/13.6M [00:00&lt;00:00, 33.4MB/s]
 </pre></div>
 </div>
 </div>
@@ -608,7 +608,7 @@ output values are identical out of 1000 outputs from mobilenet v2.</p>
 </div>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Execution time summary:
  mean (ms)   median (ms)    max (ms)     min (ms)     std (ms)
-  86.1134      86.0590      87.1695      85.8505       0.2320
+  86.0615      86.0674      90.4477      85.4856       0.5247
 </pre></div>
 </div>
 <div class="admonition note">
@@ -647,7 +647,7 @@ This includes support for the VNNI 8 bit dot product instruction (CascadeLake or
 <div class="section" id="deploy-a-quantized-tflite-model">
 <h2>Deploy a quantized TFLite Model<a class="headerlink" href="#deploy-a-quantized-tflite-model" title="Permalink to this headline">¶</a></h2>
 <p>TODO</p>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  27.347 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  24.367 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-deploy-models-deploy-prequantized-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/fb8217c13f4351224c6cf3aacf1a87fc/deploy_prequantized.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">deploy_prequantized.py</span></code></a></p>
diff --git a/docs/how_to/deploy_models/deploy_prequantized_tflite.html b/docs/how_to/deploy_models/deploy_prequantized_tflite.html
index 2aeed8c73d..35b43ef30e 100644
--- a/docs/how_to/deploy_models/deploy_prequantized_tflite.html
+++ b/docs/how_to/deploy_models/deploy_prequantized_tflite.html
@@ -600,7 +600,7 @@ TFLite Top-5 labels: [387 102 386 341 349]
 </div>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Execution time summary:
  mean (ms)   median (ms)    max (ms)     min (ms)     std (ms)
-  107.8047     107.4867     137.6822     106.9999      3.0148
+  103.5347     103.3730     105.4180     102.4175      0.7573
 </pre></div>
 </div>
 <div class="admonition note">
diff --git a/docs/how_to/deploy_models/index.html b/docs/how_to/deploy_models/index.html
index 55d5af8e16..45bc47f6f6 100644
--- a/docs/how_to/deploy_models/index.html
+++ b/docs/how_to/deploy_models/index.html
@@ -49,7 +49,7 @@
     <link rel="index" title="Index" href="../../genindex.html" />
     <link rel="search" title="Search" href="../../search.html" />
     <link rel="next" title="Deploy the Pretrained Model on Adreno™" href="deploy_model_on_adreno.html" />
-    <link rel="prev" title="Relay BNNS Integration" href="../deploy/bnns.html" /> 
+    <link rel="prev" title="Marvell Machine Learning Integration" href="../deploy/mrvl.html" /> 
 </head>
 
 <body class="wy-body-for-nav">
@@ -438,7 +438,7 @@ backends.</p>
         <a href="deploy_model_on_adreno.html" class="btn btn-neutral float-right" title="Deploy the Pretrained Model on Adreno™" accesskey="n" rel="next">Next <span class="fa fa-arrow-circle-right"></span></a>
       
       
-        <a href="../deploy/bnns.html" class="btn btn-neutral float-left" title="Relay BNNS Integration" accesskey="p" rel="prev"><span class="fa fa-arrow-circle-left"></span> Previous</a>
+        <a href="../deploy/mrvl.html" class="btn btn-neutral float-left" title="Marvell Machine Learning Integration" accesskey="p" rel="prev"><span class="fa fa-arrow-circle-left"></span> Previous</a>
       
     </div>
 
diff --git a/docs/how_to/deploy_models/sg_execution_times.html b/docs/how_to/deploy_models/sg_execution_times.html
index b47b9c863f..4bb39f55bf 100644
--- a/docs/how_to/deploy_models/sg_execution_times.html
+++ b/docs/how_to/deploy_models/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-how-to-deploy-models-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>09:41.173</strong> total execution time for <strong>how_to_deploy_models</strong> files:</p>
+<p><strong>09:19.801</strong> total execution time for <strong>how_to_deploy_models</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 86%" />
@@ -369,39 +369,39 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="deploy_object_detection_pytorch.html#sphx-glr-how-to-deploy-models-deploy-object-detection-pytorch-py"><span class="std std-ref">Compile PyTorch Object Detection Models</span></a> (<code class="docutils literal notranslate"><span class="pre">deploy_object_detection_pytorch.py</span></code>)</p></td>
-<td><p>03:30.916</p></td>
+<td><p>03:20.110</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="deploy_prequantized.html#sphx-glr-how-to-deploy-models-deploy-prequantized-py"><span class="std std-ref">Deploy a Framework-prequantized Model with TVM</span></a> (<code class="docutils literal notranslate"><span class="pre">deploy_prequantized.py</span></code>)</p></td>
-<td><p>01:27.347</p></td>
+<td><p>01:24.367</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="deploy_model_on_adreno.html#sphx-glr-how-to-deploy-models-deploy-model-on-adreno-py"><span class="std std-ref">Deploy the Pretrained Model on Adreno™</span></a> (<code class="docutils literal notranslate"><span class="pre">deploy_model_on_adreno.py</span></code>)</p></td>
-<td><p>01:17.688</p></td>
+<td><p>01:16.408</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="deploy_prequantized_tflite.html#sphx-glr-how-to-deploy-models-deploy-prequantized-tflite-py"><span class="std std-ref">Deploy a Framework-prequantized Model with TVM - Part 3 (TFLite)</span></a> (<code class="docutils literal notranslate"><span class="pre">deploy_prequantized_tflite.py</span></code>)</p></td>
-<td><p>00:51.214</p></td>
+<td><p>00:49.258</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="deploy_model_on_android.html#sphx-glr-how-to-deploy-models-deploy-model-on-android-py"><span class="std std-ref">Deploy the Pretrained Model on Android</span></a> (<code class="docutils literal notranslate"><span class="pre">deploy_model_on_android.py</span></code>)</p></td>
-<td><p>00:50.199</p></td>
+<td><p>00:48.145</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="deploy_model_on_adreno_tvmc.html#sphx-glr-how-to-deploy-models-deploy-model-on-adreno-tvmc-py"><span class="std std-ref">Deploy the Pretrained Model on Adreno™ with tvmc Interface</span></a> (<code class="docutils literal notranslate"><span class="pre">deploy_model_on_adreno_tvmc.py</span></code>)</p></td>
-<td><p>00:45.313</p></td>
+<td><p>00:44.474</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
-<tr class="row-odd"><td><p><a class="reference internal" href="deploy_model_on_nano.html#sphx-glr-how-to-deploy-models-deploy-model-on-nano-py"><span class="std std-ref">Deploy the Pretrained Model on Jetson Nano</span></a> (<code class="docutils literal notranslate"><span class="pre">deploy_model_on_nano.py</span></code>)</p></td>
-<td><p>00:29.302</p></td>
+<tr class="row-odd"><td><p><a class="reference internal" href="deploy_model_on_rasp.html#sphx-glr-how-to-deploy-models-deploy-model-on-rasp-py"><span class="std std-ref">Deploy the Pretrained Model on Raspberry Pi</span></a> (<code class="docutils literal notranslate"><span class="pre">deploy_model_on_rasp.py</span></code>)</p></td>
+<td><p>00:28.678</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
-<tr class="row-even"><td><p><a class="reference internal" href="deploy_model_on_rasp.html#sphx-glr-how-to-deploy-models-deploy-model-on-rasp-py"><span class="std std-ref">Deploy the Pretrained Model on Raspberry Pi</span></a> (<code class="docutils literal notranslate"><span class="pre">deploy_model_on_rasp.py</span></code>)</p></td>
-<td><p>00:29.183</p></td>
+<tr class="row-even"><td><p><a class="reference internal" href="deploy_model_on_nano.html#sphx-glr-how-to-deploy-models-deploy-model-on-nano-py"><span class="std std-ref">Deploy the Pretrained Model on Jetson Nano</span></a> (<code class="docutils literal notranslate"><span class="pre">deploy_model_on_nano.py</span></code>)</p></td>
+<td><p>00:28.351</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="deploy_sparse.html#sphx-glr-how-to-deploy-models-deploy-sparse-py"><span class="std std-ref">Deploy a Hugging Face Pruned Model on CPU</span></a> (<code class="docutils literal notranslate"><span class="pre">deploy_sparse.py</span></code>)</p></td>
-<td><p>00:00.011</p></td>
+<td><p>00:00.010</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 </tbody>
diff --git a/docs/how_to/extend_tvm/sg_execution_times.html b/docs/how_to/extend_tvm/sg_execution_times.html
index 038489245a..47058be2f7 100644
--- a/docs/how_to/extend_tvm/sg_execution_times.html
+++ b/docs/how_to/extend_tvm/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-how-to-extend-tvm-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>00:03.951</strong> total execution time for <strong>how_to_extend_tvm</strong> files:</p>
+<p><strong>00:03.781</strong> total execution time for <strong>how_to_extend_tvm</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 83%" />
@@ -369,11 +369,11 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="use_pass_instrument.html#sphx-glr-how-to-extend-tvm-use-pass-instrument-py"><span class="std std-ref">How to Use TVM Pass Instrument</span></a> (<code class="docutils literal notranslate"><span class="pre">use_pass_instrument.py</span></code>)</p></td>
-<td><p>00:02.781</p></td>
+<td><p>00:02.665</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="use_pass_infra.html#sphx-glr-how-to-extend-tvm-use-pass-infra-py"><span class="std std-ref">How to Use TVM Pass Infra</span></a> (<code class="docutils literal notranslate"><span class="pre">use_pass_infra.py</span></code>)</p></td>
-<td><p>00:01.163</p></td>
+<td><p>00:01.109</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="low_level_custom_pass.html#sphx-glr-how-to-extend-tvm-low-level-custom-pass-py"><span class="std std-ref">Writing a Customized Pass</span></a> (<code class="docutils literal notranslate"><span class="pre">low_level_custom_pass.py</span></code>)</p></td>
diff --git a/docs/how_to/extend_tvm/use_pass_instrument.html b/docs/how_to/extend_tvm/use_pass_instrument.html
index f33cc783c1..0cc8da255e 100644
--- a/docs/how_to/extend_tvm/use_pass_instrument.html
+++ b/docs/how_to/extend_tvm/use_pass_instrument.html
@@ -545,10 +545,10 @@ profile the execution time of each passes.</p>
 </pre></div>
 </div>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Printing results of timing profile...
-InferType: 25716us [25716us] (49.16%; 49.16%)
-FoldScaleAxis: 26597us [8us] (50.84%; 50.84%)
-        FoldConstant: 26588us [1795us] (50.83%; 99.97%)
-                InferType: 24793us [24793us] (47.39%; 93.25%)
+InferType: 24175us [24175us] (48.76%; 48.76%)
+FoldScaleAxis: 25403us [6us] (51.24%; 51.24%)
+        FoldConstant: 25396us [1739us] (51.22%; 99.97%)
+                InferType: 23657us [23657us] (47.72%; 93.15%)
 </pre></div>
 </div>
 </div>
@@ -570,10 +570,10 @@ Refer to following sections and <a class="reference internal" href="../../refere
 </pre></div>
 </div>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Printing results of timing profile...
-InferType: 24354us [24354us] (48.30%; 48.30%)
-FoldScaleAxis: 26071us [7us] (51.70%; 51.70%)
-        FoldConstant: 26064us [1818us] (51.69%; 99.97%)
-                InferType: 24246us [24246us] (48.08%; 93.02%)
+InferType: 23391us [23391us] (48.55%; 48.55%)
+FoldScaleAxis: 24783us [5us] (51.45%; 51.45%)
+        FoldConstant: 24778us [1648us] (51.43%; 99.98%)
+                InferType: 23130us [23130us] (48.01%; 93.35%)
 </pre></div>
 </div>
 <p>Register empty list to clear existing instruments.</p>
diff --git a/docs/how_to/optimize_operators/opt_conv_cuda.html b/docs/how_to/optimize_operators/opt_conv_cuda.html
index 7589bc04f1..557eb95371 100644
--- a/docs/how_to/optimize_operators/opt_conv_cuda.html
+++ b/docs/how_to/optimize_operators/opt_conv_cuda.html
@@ -595,7 +595,7 @@ latency of convolution.</p>
 <span class="nb">print</span><span class="p">(</span><span class="s2">&quot;Convolution: </span><span class="si">%f</span><span class="s2"> ms&quot;</span> <span class="o">%</span> <span class="p">(</span><span class="n">evaluator</span><span class="p">(</span><span class="n">a</span><span class="p">,</span> <span class="n">w</span><span class="p">,</span> <span class="n">b</span><span class="p">)</span><span class="o">.</span><span class="n">mean</span> <span class="o">*</span> <span cl [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Convolution: 33.872543 ms
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Convolution: 35.691360 ms
 </pre></div>
 </div>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-optimize-operators-opt-conv-cuda-py">
diff --git a/docs/how_to/optimize_operators/opt_conv_tensorcore.html b/docs/how_to/optimize_operators/opt_conv_tensorcore.html
index 8f89aaa5ab..303fa516ee 100644
--- a/docs/how_to/optimize_operators/opt_conv_tensorcore.html
+++ b/docs/how_to/optimize_operators/opt_conv_tensorcore.html
@@ -877,7 +877,7 @@ be able to run on our build server</p>
     <span class="nb">print</span><span class="p">(</span><span class="s2">&quot;conv2d with tensor core: </span><span class="si">%f</span><span class="s2"> ms&quot;</span> <span class="o">%</span> <span class="p">(</span><span class="n">evaluator</span><span class="p">(</span><span class="n">a</span><span class="p">,</span> <span class="n">w</span><span class="p">,</span> <span class="n">c</span><span class="p">)</span><span class="o">.</span><span class="n">mean</span> <span class="o">* [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>conv2d with tensor core: 12.265424 ms
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>conv2d with tensor core: 12.264919 ms
 </pre></div>
 </div>
 </div>
diff --git a/docs/how_to/optimize_operators/opt_gemm.html b/docs/how_to/optimize_operators/opt_gemm.html
index 9575b695f5..7bbf7073ea 100644
--- a/docs/how_to/optimize_operators/opt_gemm.html
+++ b/docs/how_to/optimize_operators/opt_gemm.html
@@ -492,8 +492,8 @@ Then we write a baseline implementation, the simplest way to write a matrix mult
 <span class="nb">print</span><span class="p">(</span><span class="s2">&quot;Baseline: </span><span class="si">%f</span><span class="s2">&quot;</span> <span class="o">%</span> <span class="n">evaluator</span><span class="p">(</span><span class="n">a</span><span class="p">,</span> <span class="n">b</span><span class="p">,</span> <span class="n">c</span><span class="p">)</span><span class="o">.</span><span class="n">mean</span><span class="p">)</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Numpy running time: 0.016915
-Baseline: 3.385801
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Numpy running time: 0.013416
+Baseline: 3.391921
 </pre></div>
 </div>
 <p>In TVM, we can always inspect lower level IR to debug or optimize our schedule.
@@ -552,7 +552,7 @@ fill 32 * 32 * sizeof(float) which is 4KB in the cache whose total size is 32KB
 <span class="nb">print</span><span class="p">(</span><span class="s2">&quot;Opt1: </span><span class="si">%f</span><span class="s2">&quot;</span> <span class="o">%</span> <span class="n">evaluator</span><span class="p">(</span><span class="n">a</span><span class="p">,</span> <span class="n">b</span><span class="p">,</span> <span class="n">c</span><span class="p">)</span><span class="o">.</span><span class="n">mean</span><span class="p">)</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Opt1: 0.300350
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Opt1: 0.291530
 </pre></div>
 </div>
 <p>Here is the generated IR after blocking.</p>
@@ -609,7 +609,7 @@ vastly.</p>
 <span class="nb">print</span><span class="p">(</span><span class="s2">&quot;Opt2: </span><span class="si">%f</span><span class="s2">&quot;</span> <span class="o">%</span> <span class="n">evaluator</span><span class="p">(</span><span class="n">a</span><span class="p">,</span> <span class="n">b</span><span class="p">,</span> <span class="n">c</span><span class="p">)</span><span class="o">.</span><span class="n">mean</span><span class="p">)</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Opt2: 0.259783
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Opt2: 0.259740
 </pre></div>
 </div>
 <p>Here is the generated IR after vectorization.</p>
@@ -664,7 +664,7 @@ the access pattern for A matrix is more cache friendly.</p>
 <span class="nb">print</span><span class="p">(</span><span class="s2">&quot;Opt3: </span><span class="si">%f</span><span class="s2">&quot;</span> <span class="o">%</span> <span class="n">evaluator</span><span class="p">(</span><span class="n">a</span><span class="p">,</span> <span class="n">b</span><span class="p">,</span> <span class="n">c</span><span class="p">)</span><span class="o">.</span><span class="n">mean</span><span class="p">)</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Opt3: 0.112768
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Opt3: 0.110507
 </pre></div>
 </div>
 <p>Here is the generated IR after loop permutation.</p>
@@ -741,7 +741,7 @@ flattening.</p>
 <span class="nb">print</span><span class="p">(</span><span class="s2">&quot;Opt4: </span><span class="si">%f</span><span class="s2">&quot;</span> <span class="o">%</span> <span class="n">evaluator</span><span class="p">(</span><span class="n">a</span><span class="p">,</span> <span class="n">b</span><span class="p">,</span> <span class="n">c</span><span class="p">)</span><span class="o">.</span><span class="n">mean</span><span class="p">)</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Opt4: 0.103754
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Opt4: 0.103994
 </pre></div>
 </div>
 <p>Here is the generated IR after array packing.</p>
@@ -819,7 +819,7 @@ write to C when all the block results are ready.</p>
 <span class="nb">print</span><span class="p">(</span><span class="s2">&quot;Opt5: </span><span class="si">%f</span><span class="s2">&quot;</span> <span class="o">%</span> <span class="n">evaluator</span><span class="p">(</span><span class="n">a</span><span class="p">,</span> <span class="n">b</span><span class="p">,</span> <span class="n">c</span><span class="p">)</span><span class="o">.</span><span class="n">mean</span><span class="p">)</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Opt5: 0.104403
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Opt5: 0.094832
 </pre></div>
 </div>
 <p>Here is the generated IR after blocking.</p>
@@ -899,7 +899,7 @@ class Module:
 <span class="nb">print</span><span class="p">(</span><span class="s2">&quot;Opt6: </span><span class="si">%f</span><span class="s2">&quot;</span> <span class="o">%</span> <span class="n">opt6_time</span><span class="p">)</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Opt6: 0.123906
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Opt6: 0.111884
 </pre></div>
 </div>
 <p>Here is the generated IR after parallelization.</p>
diff --git a/docs/how_to/optimize_operators/sg_execution_times.html b/docs/how_to/optimize_operators/sg_execution_times.html
index 5c22cf53e0..802d53d26a 100644
--- a/docs/how_to/optimize_operators/sg_execution_times.html
+++ b/docs/how_to/optimize_operators/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-how-to-optimize-operators-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>00:33.003</strong> total execution time for <strong>how_to_optimize_operators</strong> files:</p>
+<p><strong>00:31.557</strong> total execution time for <strong>how_to_optimize_operators</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 83%" />
@@ -369,15 +369,15 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="opt_gemm.html#sphx-glr-how-to-optimize-operators-opt-gemm-py"><span class="std std-ref">How to optimize GEMM on CPU</span></a> (<code class="docutils literal notranslate"><span class="pre">opt_gemm.py</span></code>)</p></td>
-<td><p>00:29.838</p></td>
+<td><p>00:28.564</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="opt_conv_tensorcore.html#sphx-glr-how-to-optimize-operators-opt-conv-tensorcore-py"><span class="std std-ref">How to optimize convolution using TensorCores</span></a> (<code class="docutils literal notranslate"><span class="pre">opt_conv_tensorcore.py</span></code>)</p></td>
-<td><p>00:01.940</p></td>
+<td><p>00:01.846</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="opt_conv_cuda.html#sphx-glr-how-to-optimize-operators-opt-conv-cuda-py"><span class="std std-ref">How to optimize convolution on GPU</span></a> (<code class="docutils literal notranslate"><span class="pre">opt_conv_cuda.py</span></code>)</p></td>
-<td><p>00:01.225</p></td>
+<td><p>00:01.147</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 </tbody>
diff --git a/docs/how_to/tune_with_autoscheduler/sg_execution_times.html b/docs/how_to/tune_with_autoscheduler/sg_execution_times.html
index bfe3a1bf2a..f0392255bf 100644
--- a/docs/how_to/tune_with_autoscheduler/sg_execution_times.html
+++ b/docs/how_to/tune_with_autoscheduler/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-how-to-tune-with-autoscheduler-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>03:33.708</strong> total execution time for <strong>how_to_tune_with_autoscheduler</strong> files:</p>
+<p><strong>03:25.824</strong> total execution time for <strong>how_to_tune_with_autoscheduler</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 85%" />
@@ -369,23 +369,23 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="tune_network_x86.html#sphx-glr-how-to-tune-with-autoscheduler-tune-network-x86-py"><span class="std std-ref">Auto-scheduling a Neural Network for x86 CPU</span></a> (<code class="docutils literal notranslate"><span class="pre">tune_network_x86.py</span></code>)</p></td>
-<td><p>01:32.074</p></td>
+<td><p>01:28.951</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="tune_network_cuda.html#sphx-glr-how-to-tune-with-autoscheduler-tune-network-cuda-py"><span class="std std-ref">Auto-scheduling a Neural Network for NVIDIA GPU</span></a> (<code class="docutils literal notranslate"><span class="pre">tune_network_cuda.py</span></code>)</p></td>
-<td><p>01:11.910</p></td>
+<td><p>01:09.182</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="tune_network_arm.html#sphx-glr-how-to-tune-with-autoscheduler-tune-network-arm-py"><span class="std std-ref">Auto-scheduling a Neural Network for ARM CPU</span></a> (<code class="docutils literal notranslate"><span class="pre">tune_network_arm.py</span></code>)</p></td>
-<td><p>00:17.441</p></td>
+<td><p>00:16.840</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="tune_network_mali.html#sphx-glr-how-to-tune-with-autoscheduler-tune-network-mali-py"><span class="std std-ref">Auto-scheduling a Neural Network for mali GPU</span></a> (<code class="docutils literal notranslate"><span class="pre">tune_network_mali.py</span></code>)</p></td>
-<td><p>00:16.124</p></td>
+<td><p>00:15.610</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="tune_conv2d_layer_cuda.html#sphx-glr-how-to-tune-with-autoscheduler-tune-conv2d-layer-cuda-py"><span class="std std-ref">Auto-scheduling a Convolution Layer for GPU</span></a> (<code class="docutils literal notranslate"><span class="pre">tune_conv2d_layer_cuda.py</span></code>)</p></td>
-<td><p>00:16.062</p></td>
+<td><p>00:15.144</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="tune_sparse_x86.html#sphx-glr-how-to-tune-with-autoscheduler-tune-sparse-x86-py"><span class="std std-ref">Auto-scheduling Sparse Matrix Multiplication on CPU with Custom Sketch Rule</span></a> (<code class="docutils literal notranslate"><span class="pre">tune_sparse_x86.py</span></code>)</p></td>
diff --git a/docs/how_to/tune_with_autoscheduler/tune_conv2d_layer_cuda.html b/docs/how_to/tune_with_autoscheduler/tune_conv2d_layer_cuda.html
index e604733d55..65f728d90e 100644
--- a/docs/how_to/tune_with_autoscheduler/tune_conv2d_layer_cuda.html
+++ b/docs/how_to/tune_with_autoscheduler/tune_conv2d_layer_cuda.html
@@ -1032,7 +1032,7 @@ class Module:
 <span class="p">)</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Execution time of this operator: 0.344 ms
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Execution time of this operator: 0.345 ms
 </pre></div>
 </div>
 </div>
diff --git a/docs/how_to/tune_with_autoscheduler/tune_network_cuda.html b/docs/how_to/tune_with_autoscheduler/tune_network_cuda.html
index a610cad8f6..8a6f4a0b51 100644
--- a/docs/how_to/tune_with_autoscheduler/tune_network_cuda.html
+++ b/docs/how_to/tune_with_autoscheduler/tune_network_cuda.html
@@ -923,7 +923,7 @@ so we can read the log file and load the best schedules.</p>
 Evaluate inference time cost...
 Execution time summary:
  mean (ms)   median (ms)    max (ms)     min (ms)     std (ms)
-   3.2051       3.2046       3.2064       3.2044       0.0009
+   3.2105       3.2106       3.2107       3.2103       0.0002
 </pre></div>
 </div>
 </div>
@@ -945,7 +945,7 @@ to learn how to use the RPC Tracker and RPC Server.
 To use the RPC Tracker in auto-scheduler, replace the runner in <code class="code docutils literal notranslate"><span class="pre">TuningOptions</span></code>
 with <a class="reference internal" href="../../reference/api/python/auto_scheduler.html#tvm.auto_scheduler.RPCRunner" title="tvm.auto_scheduler.RPCRunner"><code class="xref any py py-class docutils literal notranslate"><span class="pre">auto_scheduler.RPCRunner</span></code></a>.</p></li>
 </ol>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  11.910 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  9.182 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-tune-with-autoscheduler-tune-network-cuda-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/eafe360d52540634c9eea0fa89e804bd/tune_network_cuda.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">tune_network_cuda.py</span></code></a></p>
diff --git a/docs/how_to/tune_with_autoscheduler/tune_network_x86.html b/docs/how_to/tune_with_autoscheduler/tune_network_x86.html
index 3b59a28ac3..9ad2c19059 100644
--- a/docs/how_to/tune_with_autoscheduler/tune_network_x86.html
+++ b/docs/how_to/tune_with_autoscheduler/tune_network_x86.html
@@ -945,7 +945,7 @@ so we can read the log file and load the best schedules.</p>
 Evaluate inference time cost...
 Execution time summary:
  mean (ms)   median (ms)    max (ms)     min (ms)     std (ms)
-  734.4810     734.0194     736.5202     732.9033      1.5122
+  701.7361     700.7611     704.0504     700.3969      1.6431
 </pre></div>
 </div>
 </div>
@@ -967,7 +967,7 @@ to learn how to use the RPC Tracker and RPC Server.
 To use the RPC Tracker in auto-scheduler, replace the runner in <code class="code docutils literal notranslate"><span class="pre">TuningOptions</span></code>
 with <a class="reference internal" href="../../reference/api/python/auto_scheduler.html#tvm.auto_scheduler.RPCRunner" title="tvm.auto_scheduler.RPCRunner"><code class="xref any py py-class docutils literal notranslate"><span class="pre">auto_scheduler.RPCRunner</span></code></a>.</p></li>
 </ol>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  32.074 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  28.951 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-tune-with-autoscheduler-tune-network-x86-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/e416b94ca1090b0897c0f6e0df95b911/tune_network_x86.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">tune_network_x86.py</span></code></a></p>
diff --git a/docs/how_to/tune_with_autotvm/sg_execution_times.html b/docs/how_to/tune_with_autotvm/sg_execution_times.html
index 1b7e726249..d03b369733 100644
--- a/docs/how_to/tune_with_autotvm/sg_execution_times.html
+++ b/docs/how_to/tune_with_autotvm/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-how-to-tune-with-autotvm-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>00:23.486</strong> total execution time for <strong>how_to_tune_with_autotvm</strong> files:</p>
+<p><strong>00:22.820</strong> total execution time for <strong>how_to_tune_with_autotvm</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 84%" />
@@ -369,7 +369,7 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="tune_conv2d_cuda.html#sphx-glr-how-to-tune-with-autotvm-tune-conv2d-cuda-py"><span class="std std-ref">Tuning High Performance Convolution on NVIDIA GPUs</span></a> (<code class="docutils literal notranslate"><span class="pre">tune_conv2d_cuda.py</span></code>)</p></td>
-<td><p>00:23.450</p></td>
+<td><p>00:22.784</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="tune_relay_x86.html#sphx-glr-how-to-tune-with-autotvm-tune-relay-x86-py"><span class="std std-ref">Auto-tuning a Convolutional Network for x86 CPU</span></a> (<code class="docutils literal notranslate"><span class="pre">tune_relay_x86.py</span></code>)</p></td>
diff --git a/docs/how_to/tune_with_autotvm/tune_conv2d_cuda.html b/docs/how_to/tune_with_autotvm/tune_conv2d_cuda.html
index dd4c52a44d..203983053f 100644
--- a/docs/how_to/tune_with_autotvm/tune_conv2d_cuda.html
+++ b/docs/how_to/tune_with_autotvm/tune_conv2d_cuda.html
@@ -630,7 +630,7 @@ and measure running time.</p>
 
 Best config:
 ,None
-Time cost of this operator: 0.037160
+Time cost of this operator: 0.037194
 </pre></div>
 </div>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-tune-with-autotvm-tune-conv2d-cuda-py">
diff --git a/docs/how_to/work_with_microtvm/micro_autotune.html b/docs/how_to/work_with_microtvm/micro_autotune.html
index acf4ecff81..bddd59d8c7 100644
--- a/docs/how_to/work_with_microtvm/micro_autotune.html
+++ b/docs/how_to/work_with_microtvm/micro_autotune.html
@@ -664,10 +664,10 @@ the tuned operator.</p>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>########## Build without Autotuning ##########
 Node Name                                     Ops                                           Time(us)  Time(%)  Shape              Inputs  Outputs  Measurements(us)
 ---------                                     ---                                           --------  -------  -----              ------  -------  ----------------
-tvmgen_default_fused_nn_contrib_conv2d_NCHWc  tvmgen_default_fused_nn_contrib_conv2d_NCHWc  296.0     98.705   (1, 2, 10, 10, 3)  2       1        [296.0]
-tvmgen_default_fused_layout_transform_1       tvmgen_default_fused_layout_transform_1       2.941     0.981    (1, 6, 10, 10)     1       1        [2.941]
-tvmgen_default_fused_layout_transform         tvmgen_default_fused_layout_transform         0.943     0.315    (1, 1, 10, 10, 3)  1       1        [0.943]
-Total_time                                    -                                             299.885   -        -                  -       -        -
+tvmgen_default_fused_nn_contrib_conv2d_NCHWc  tvmgen_default_fused_nn_contrib_conv2d_NCHWc  305.1     98.768   (1, 2, 10, 10, 3)  2       1        [305.1]
+tvmgen_default_fused_layout_transform_1       tvmgen_default_fused_layout_transform_1       2.868     0.928    (1, 6, 10, 10)     1       1        [2.868]
+tvmgen_default_fused_layout_transform         tvmgen_default_fused_layout_transform         0.937     0.303    (1, 1, 10, 10, 3)  1       1        [0.937]
+Total_time                                    -                                             308.905   -        -                  -       -        -
 </pre></div>
 </div>
 </div>
@@ -719,13 +719,13 @@ Total_time                                    -
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>########## Build with Autotuning ##########
 Node Name                                     Ops                                           Time(us)  Time(%)  Shape              Inputs  Outputs  Measurements(us)
 ---------                                     ---                                           --------  -------  -----              ------  -------  ----------------
-tvmgen_default_fused_nn_contrib_conv2d_NCHWc  tvmgen_default_fused_nn_contrib_conv2d_NCHWc  130.9     98.01    (1, 6, 10, 10, 1)  2       1        [130.9]
-tvmgen_default_fused_layout_transform_1       tvmgen_default_fused_layout_transform_1       1.714     1.283    (1, 6, 10, 10)     1       1        [1.714]
-tvmgen_default_fused_layout_transform         tvmgen_default_fused_layout_transform         0.943     0.706    (1, 1, 10, 10, 3)  1       1        [0.943]
-Total_time                                    -                                             133.557   -        -                  -       -        -
+tvmgen_default_fused_nn_contrib_conv2d_NCHWc  tvmgen_default_fused_nn_contrib_conv2d_NCHWc  129.8     98.006   (1, 6, 10, 10, 1)  2       1        [129.8]
+tvmgen_default_fused_layout_transform_1       tvmgen_default_fused_layout_transform_1       1.717     1.297    (1, 6, 10, 10)     1       1        [1.717]
+tvmgen_default_fused_layout_transform         tvmgen_default_fused_layout_transform         0.924     0.697    (1, 1, 10, 10, 3)  1       1        [0.924]
+Total_time                                    -                                             132.441   -        -                  -       -        -
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  24.970 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  22.628 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-work-with-microtvm-micro-autotune-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/9ccca8fd489a1486ac71b55a55c320c5/micro_autotune.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">micro_autotune.py</span></code></a></p>
diff --git a/docs/how_to/work_with_microtvm/micro_pytorch.html b/docs/how_to/work_with_microtvm/micro_pytorch.html
index 39998ff308..6fd1863045 100644
--- a/docs/how_to/work_with_microtvm/micro_pytorch.html
+++ b/docs/how_to/work_with_microtvm/micro_pytorch.html
@@ -475,8 +475,8 @@ download a cat image and preprocess it to use as the model input.</p>
 Downloading: &quot;https://download.pytorch.org/models/quantized/mobilenet_v2_qnnpack_37f702c5.pth&quot; to /workspace/.cache/torch/hub/checkpoints/mobilenet_v2_qnnpack_37f702c5.pth
 
   0%|          | 0.00/3.42M [00:00&lt;?, ?B/s]
- 61%|######    | 2.09M/3.42M [00:00&lt;00:00, 15.8MB/s]
-100%|##########| 3.42M/3.42M [00:00&lt;00:00, 25.0MB/s]
+ 61%|######    | 2.09M/3.42M [00:00&lt;00:00, 11.8MB/s]
+100%|##########| 3.42M/3.42M [00:00&lt;00:00, 18.3MB/s]
 /venv/apache-tvm-py3.8/lib/python3.8/site-packages/torch/_utils.py:314: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly.  To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
   device=storage.device,
 /workspace/python/tvm/relay/frontend/pytorch_utils.py:47: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
@@ -604,7 +604,7 @@ via the host <cite>main.cc`</cite> or if a Zephyr emulated board is selected as
 Torch top-1 id: 282, class name: tiger cat
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  27.905 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  24.486 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-work-with-microtvm-micro-pytorch-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/12b9ecc04c41abaa12022061771821d1/micro_pytorch.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">micro_pytorch.py</span></code></a></p>
diff --git a/docs/how_to/work_with_microtvm/micro_train.html b/docs/how_to/work_with_microtvm/micro_train.html
index cbd4324495..6ea4b16723 100644
--- a/docs/how_to/work_with_microtvm/micro_train.html
+++ b/docs/how_to/work_with_microtvm/micro_train.html
@@ -543,7 +543,7 @@ take about <strong>2 minutes</strong> to download the Stanford Cars, while COCO
 <a href="https://docs.python.org/3/library/shutil.html#shutil.move" title="shutil.move" class="sphx-glr-backref-module-shutil sphx-glr-backref-type-py-function"><span class="n">shutil</span><span class="o">.</span><span class="n">move</span></a><span class="p">(</span><span class="sa">f</span><span class="s2">&quot;</span><span class="si">{</span><a href="https://docs.python.org/3/library/stdtypes.html#str" title="builtins.str" class="sphx-glr-backref-module-builtins sphx-glr-backref-typ [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>&#39;/tmp/tmpmvn13rdm/images/random&#39;
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>&#39;/tmp/tmpvp040j6r/images/random&#39;
 </pre></div>
 </div>
 </div>
@@ -603,8 +603,8 @@ objects to other stuff? We can display some examples from our datasets using <co
     <span class="n">plt</span><span class="o">.</span><span class="n">axis</span><span class="p">(</span><span class="s2">&quot;off&quot;</span><span class="p">)</span>
 </pre></div>
 </div>
-<img src="../../_images/sphx_glr_micro_train_001.png" srcset="../../_images/sphx_glr_micro_train_001.png" alt="[1.0, 0.0], [1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0], [0.0, 1.0], [0.0, 1.0], [0.0, 1.0], [1.0, 0.0], [0.0, 1.0]" class = "sphx-glr-single-img"/><div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>/tmp/tmpmvn13rdm/images/target contains 8144 images
-/tmp/tmpmvn13rdm/images/random contains 5000 images
+<img src="../../_images/sphx_glr_micro_train_001.png" srcset="../../_images/sphx_glr_micro_train_001.png" alt="[1.0, 0.0], [1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0], [0.0, 1.0], [0.0, 1.0], [0.0, 1.0], [1.0, 0.0], [0.0, 1.0]" class = "sphx-glr-single-img"/><div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>/tmp/tmpvp040j6r/images/target contains 8144 images
+/tmp/tmpvp040j6r/images/random contains 5000 images
 </pre></div>
 </div>
 </div>
@@ -716,13 +716,13 @@ the time on our validation set).</p>
 </pre></div>
 </div>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Epoch 1/3
-328/328 - 38s - loss: 0.2239 - accuracy: 0.9251 - val_loss: 0.1174 - val_accuracy: 0.9611 - 38s/epoch - 116ms/step
+328/328 - 38s - loss: 0.2149 - accuracy: 0.9268 - val_loss: 0.1321 - val_accuracy: 0.9528 - 38s/epoch - 117ms/step
 Epoch 2/3
-328/328 - 34s - loss: 0.1036 - accuracy: 0.9608 - val_loss: 0.1293 - val_accuracy: 0.9468 - 34s/epoch - 105ms/step
+328/328 - 34s - loss: 0.1025 - accuracy: 0.9633 - val_loss: 0.1075 - val_accuracy: 0.9660 - 34s/epoch - 104ms/step
 Epoch 3/3
-328/328 - 34s - loss: 0.0704 - accuracy: 0.9746 - val_loss: 0.1212 - val_accuracy: 0.9603 - 34s/epoch - 105ms/step
+328/328 - 34s - loss: 0.0693 - accuracy: 0.9735 - val_loss: 0.1036 - val_accuracy: 0.9671 - 34s/epoch - 104ms/step
 
-&lt;keras.callbacks.History object at 0x7f046de62760&gt;
+&lt;keras.callbacks.History object at 0x7f8f73b4a8e0&gt;
 </pre></div>
 </div>
 </div>
@@ -986,7 +986,7 @@ as intended.</p>
 <p>From here, we could modify the model to read live images from the camera - we have another
 Arduino tutorial for how to do that <a class="reference external" href="https://github.com/guberti/tvm-arduino-demos/tree/master/examples/person_detection">on GitHub</a>. Alternatively, we could also
 <a class="reference external" href="https://tvm.apache.org/docs/how_to/work_with_microtvm/micro_autotune.html">use TVM’s autotuning capabilities</a> to dramatically improve the model’s performance.</p>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 4 minutes  48.012 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 4 minutes  45.515 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-how-to-work-with-microtvm-micro-train-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/b52cec46baf4f78d6bcd94cbe269c8a6/micro_train.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">micro_train.py</span></code></a></p>
diff --git a/docs/how_to/work_with_microtvm/sg_execution_times.html b/docs/how_to/work_with_microtvm/sg_execution_times.html
index 5d0383d26a..60b08cf09e 100644
--- a/docs/how_to/work_with_microtvm/sg_execution_times.html
+++ b/docs/how_to/work_with_microtvm/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-how-to-work-with-microtvm-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>08:09.140</strong> total execution time for <strong>how_to_work_with_microtvm</strong> files:</p>
+<p><strong>08:00.439</strong> total execution time for <strong>how_to_work_with_microtvm</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 82%" />
@@ -369,34 +369,34 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="micro_train.html#sphx-glr-how-to-work-with-microtvm-micro-train-py"><span class="std std-ref">5. Training Vision Models for microTVM on Arduino</span></a> (<code class="docutils literal notranslate"><span class="pre">micro_train.py</span></code>)</p></td>
-<td><p>04:48.012</p></td>
+<td><p>04:45.515</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="micro_pytorch.html#sphx-glr-how-to-work-with-microtvm-micro-pytorch-py"><span class="std std-ref">4. microTVM PyTorch Tutorial</span></a> (<code class="docutils literal notranslate"><span class="pre">micro_pytorch.py</span></code>)</p></td>
-<td><p>01:27.905</p></td>
+<td><p>01:24.486</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="micro_autotune.html#sphx-glr-how-to-work-with-microtvm-micro-autotune-py"><span class="std std-ref">6. Model Tuning with microTVM</span></a> (<code class="docutils literal notranslate"><span class="pre">micro_autotune.py</span></code>)</p></td>
-<td><p>01:24.970</p></td>
+<td><p>01:22.628</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="micro_aot.html#sphx-glr-how-to-work-with-microtvm-micro-aot-py"><span class="std std-ref">3. microTVM Ahead-of-Time (AOT) Compilation</span></a> (<code class="docutils literal notranslate"><span class="pre">micro_aot.py</span></code>)</p></td>
-<td><p>00:11.857</p></td>
+<td><p>00:11.454</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="micro_custom_ide.html#sphx-glr-how-to-work-with-microtvm-micro-custom-ide-py"><span class="std std-ref">9. Bring microTVM to your own development environment</span></a> (<code class="docutils literal notranslate"><span class="pre">micro_custom_ide.py</span></code>)</p></td>
-<td><p>00:08.907</p></td>
+<td><p>00:08.370</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="micro_tflite.html#sphx-glr-how-to-work-with-microtvm-micro-tflite-py"><span class="std std-ref">2. microTVM TFLite Tutorial</span></a> (<code class="docutils literal notranslate"><span class="pre">micro_tflite.py</span></code>)</p></td>
-<td><p>00:07.490</p></td>
+<td><p>00:07.987</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
-<tr class="row-odd"><td><p><a class="reference internal" href="micro_tvmc.html#sphx-glr-how-to-work-with-microtvm-micro-tvmc-py"><span class="std std-ref">1. microTVM CLI Tool</span></a> (<code class="docutils literal notranslate"><span class="pre">micro_tvmc.py</span></code>)</p></td>
+<tr class="row-odd"><td><p><a class="reference internal" href="micro_ethosu.html#sphx-glr-how-to-work-with-microtvm-micro-ethosu-py"><span class="std std-ref">7. Running TVM on bare metal Arm(R) Cortex(R)-M55 CPU and Ethos(TM)-U55 NPU with CMSIS-NN</span></a> (<code class="docutils literal notranslate"><span class="pre">micro_ethosu.py</span></code>)</p></td>
 <td><p>00:00.000</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
-<tr class="row-even"><td><p><a class="reference internal" href="micro_ethosu.html#sphx-glr-how-to-work-with-microtvm-micro-ethosu-py"><span class="std std-ref">7. Running TVM on bare metal Arm(R) Cortex(R)-M55 CPU and Ethos(TM)-U55 NPU with CMSIS-NN</span></a> (<code class="docutils literal notranslate"><span class="pre">micro_ethosu.py</span></code>)</p></td>
+<tr class="row-even"><td><p><a class="reference internal" href="micro_tvmc.html#sphx-glr-how-to-work-with-microtvm-micro-tvmc-py"><span class="std std-ref">1. microTVM CLI Tool</span></a> (<code class="docutils literal notranslate"><span class="pre">micro_tvmc.py</span></code>)</p></td>
 <td><p>00:00.000</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
diff --git a/docs/how_to/work_with_relay/sg_execution_times.html b/docs/how_to/work_with_relay/sg_execution_times.html
index d50f29f3f9..0fb8b1b77e 100644
--- a/docs/how_to/work_with_relay/sg_execution_times.html
+++ b/docs/how_to/work_with_relay/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-how-to-work-with-relay-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>00:38.098</strong> total execution time for <strong>how_to_work_with_relay</strong> files:</p>
+<p><strong>00:37.291</strong> total execution time for <strong>how_to_work_with_relay</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 84%" />
@@ -369,15 +369,15 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="using_pipeline_executor.html#sphx-glr-how-to-work-with-relay-using-pipeline-executor-py"><span class="std std-ref">Using Pipeline Executor in Relay</span></a> (<code class="docutils literal notranslate"><span class="pre">using_pipeline_executor.py</span></code>)</p></td>
-<td><p>00:32.948</p></td>
+<td><p>00:32.255</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="using_external_lib.html#sphx-glr-how-to-work-with-relay-using-external-lib-py"><span class="std std-ref">Using External Libraries in Relay</span></a> (<code class="docutils literal notranslate"><span class="pre">using_external_lib.py</span></code>)</p></td>
-<td><p>00:03.254</p></td>
+<td><p>00:03.137</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="build_gcn.html#sphx-glr-how-to-work-with-relay-build-gcn-py"><span class="std std-ref">Building a Graph Convolutional Network</span></a> (<code class="docutils literal notranslate"><span class="pre">build_gcn.py</span></code>)</p></td>
-<td><p>00:01.890</p></td>
+<td><p>00:01.892</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="using_relay_viz.html#sphx-glr-how-to-work-with-relay-using-relay-viz-py"><span class="std std-ref">Use Relay Visualizer to Visualize Relay</span></a> (<code class="docutils literal notranslate"><span class="pre">using_relay_viz.py</span></code>)</p></td>
diff --git a/docs/how_to/work_with_schedules/intrin_math.html b/docs/how_to/work_with_schedules/intrin_math.html
index 0d4fea6e12..aa050f28b7 100644
--- a/docs/how_to/work_with_schedules/intrin_math.html
+++ b/docs/how_to/work_with_schedules/intrin_math.html
@@ -572,7 +572,7 @@ The following example customizes CUDA lowering rule for <code class="code docuti
 <a href="../../reference/api/python/ir.html#tvm.ir.register_intrin_lowering" title="tvm.ir.register_intrin_lowering" class="sphx-glr-backref-module-tvm-ir sphx-glr-backref-type-py-function"><span class="n">register_intrin_lowering</span></a><span class="p">(</span><span class="s2">&quot;tir.exp&quot;</span><span class="p">,</span> <span class="n">target</span><span class="o">=</span><span class="s2">&quot;cuda&quot;</span><span class="p">,</span> <span class="n">f</span><span class="o">= [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>&lt;function my_cuda_math_rule at 0x7f064815eee0&gt;
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>&lt;function my_cuda_math_rule at 0x7f914a1abdc0&gt;
 </pre></div>
 </div>
 <p>Register the rule to TVM with override option to override existing rule.
diff --git a/docs/how_to/work_with_schedules/sg_execution_times.html b/docs/how_to/work_with_schedules/sg_execution_times.html
index ca65ea0ae1..82310a0b72 100644
--- a/docs/how_to/work_with_schedules/sg_execution_times.html
+++ b/docs/how_to/work_with_schedules/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-how-to-work-with-schedules-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>00:05.432</strong> total execution time for <strong>how_to_work_with_schedules</strong> files:</p>
+<p><strong>00:05.466</strong> total execution time for <strong>how_to_work_with_schedules</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 83%" />
@@ -369,35 +369,35 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="intrin_math.html#sphx-glr-how-to-work-with-schedules-intrin-math-py"><span class="std std-ref">Intrinsics and Math Functions</span></a> (<code class="docutils literal notranslate"><span class="pre">intrin_math.py</span></code>)</p></td>
-<td><p>00:02.496</p></td>
+<td><p>00:02.525</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="tensorize.html#sphx-glr-how-to-work-with-schedules-tensorize-py"><span class="std std-ref">Use Tensorize to Leverage Hardware Intrinsics</span></a> (<code class="docutils literal notranslate"><span class="pre">tensorize.py</span></code>)</p></td>
-<td><p>00:01.246</p></td>
+<td><p>00:01.241</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="reduction.html#sphx-glr-how-to-work-with-schedules-reduction-py"><span class="std std-ref">Reduction</span></a> (<code class="docutils literal notranslate"><span class="pre">reduction.py</span></code>)</p></td>
-<td><p>00:00.709</p></td>
+<td><p>00:00.724</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="scan.html#sphx-glr-how-to-work-with-schedules-scan-py"><span class="std std-ref">Scan and Recurrent Kernel</span></a> (<code class="docutils literal notranslate"><span class="pre">scan.py</span></code>)</p></td>
-<td><p>00:00.698</p></td>
+<td><p>00:00.700</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="extern_op.html#sphx-glr-how-to-work-with-schedules-extern-op-py"><span class="std std-ref">External Tensor Functions</span></a> (<code class="docutils literal notranslate"><span class="pre">extern_op.py</span></code>)</p></td>
-<td><p>00:00.118</p></td>
+<td><p>00:00.114</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="schedule_primitives.html#sphx-glr-how-to-work-with-schedules-schedule-primitives-py"><span class="std std-ref">Schedule Primitives in TVM</span></a> (<code class="docutils literal notranslate"><span class="pre">schedule_primitives.py</span></code>)</p></td>
-<td><p>00:00.069</p></td>
+<td><p>00:00.068</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="tedd.html#sphx-glr-how-to-work-with-schedules-tedd-py"><span class="std std-ref">Use Tensor Expression Debug Display (TEDD) for Visualization</span></a> (<code class="docutils literal notranslate"><span class="pre">tedd.py</span></code>)</p></td>
-<td><p>00:00.066</p></td>
+<td><p>00:00.064</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="tuple_inputs.html#sphx-glr-how-to-work-with-schedules-tuple-inputs-py"><span class="std std-ref">Compute and Reduce with Tuple Inputs</span></a> (<code class="docutils literal notranslate"><span class="pre">tuple_inputs.py</span></code>)</p></td>
-<td><p>00:00.031</p></td>
+<td><p>00:00.030</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 </tbody>
diff --git a/docs/objects.inv b/docs/objects.inv
index 26e3373181..a2c45b1dc2 100644
Binary files a/docs/objects.inv and b/docs/objects.inv differ
diff --git a/docs/reference/api/python/auto_scheduler.html b/docs/reference/api/python/auto_scheduler.html
index ab806c0317..be75fbaa66 100644
--- a/docs/reference/api/python/auto_scheduler.html
+++ b/docs/reference/api/python/auto_scheduler.html
@@ -1637,7 +1637,7 @@ history states as starting point to perform Evolutionary Search).</p></li>
 
 <dl class="py class">
 <dt class="sig sig-object py" id="tvm.auto_scheduler.SketchPolicy">
-<em class="property"><span class="pre">class</span> </em><span class="sig-prename descclassname"><span class="pre">tvm.auto_scheduler.</span></span><span class="sig-name descname"><span class="pre">SketchPolicy</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">task</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">program_cost_model</span></span><span class="o"><span class="pre">=</span></span><span class="defau [...]
+<em class="property"><span class="pre">class</span> </em><span class="sig-prename descclassname"><span class="pre">tvm.auto_scheduler.</span></span><span class="sig-name descname"><span class="pre">SketchPolicy</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">task</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">program_cost_model</span></span><span class="o"><span class="pre">=</span></span><span class="defau [...]
 <dd><p>The search policy that searches in a hierarchical search space defined by sketches.
 The policy randomly samples programs from the space defined by sketches and uses evolutionary
 search to fine-tune them.</p>
@@ -1921,7 +1921,7 @@ Candidates:
 
 <dl class="py function">
 <dt class="sig sig-object py" id="tvm.auto_scheduler.auto_schedule">
-<span class="sig-prename descclassname"><span class="pre">tvm.auto_scheduler.</span></span><span class="sig-name descname"><span class="pre">auto_schedule</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">task</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">search_policy</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">None</span></span></em>, <em clas [...]
+<span class="sig-prename descclassname"><span class="pre">tvm.auto_scheduler.</span></span><span class="sig-name descname"><span class="pre">auto_schedule</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">task</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">search_policy</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">None</span></span></em>, <em clas [...]
 <dd><p>THIS API IS DEPRECATED.</p>
 <p>Run auto scheduling search for a task.</p>
 <dl class="field-list simple">
diff --git a/docs/reference/api/typedoc/classes/ArtifactCache.html b/docs/reference/api/typedoc/classes/ArtifactCache.html
index 8321277f33..3d26cdfc5d 100644
--- a/docs/reference/api/typedoc/classes/ArtifactCache.html
+++ b/docs/reference/api/typedoc/classes/ArtifactCache.html
@@ -23,7 +23,7 @@
 <ul class="tsd-hierarchy">
 <li><span class="target">ArtifactCache</span></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L991">runtime.ts:991</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L991">runtime.ts:991</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -52,7 +52,7 @@
 <h5><span class="tsd-kind-parameter">scope</span>: <span class="tsd-signature-type">string</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="ArtifactCache.html" class="tsd-signature-type tsd-kind-class">ArtifactCache</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L995">runtime.ts:995</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L995">runtime.ts:995</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="fetchWithCache" class="tsd-anchor"></a>
@@ -67,7 +67,7 @@
 <h5><span class="tsd-kind-parameter">url</span>: <span class="tsd-signature-type">string</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Promise</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type ">Response</span><span class="tsd-signature-symbol">&gt;</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L999">runtime.ts:999</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L999">runtime.ts:999</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="hasAllKeys" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>has<wbr/>All<wbr/>Keys</span><a href="#hasAllKeys" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -80,7 +80,7 @@
 <h5><span class="tsd-kind-parameter">keys</span>: <span class="tsd-signature-type">string</span><span class="tsd-signature-symbol">[]</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Promise</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">boolean</span><span class="tsd-signature-symbol">&gt;</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1015">runtime.ts:1015</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1015">runtime.ts:1015</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/DLDataType.html b/docs/reference/api/typedoc/classes/DLDataType.html
index db91386111..ad9d3c6301 100644
--- a/docs/reference/api/typedoc/classes/DLDataType.html
+++ b/docs/reference/api/typedoc/classes/DLDataType.html
@@ -23,7 +23,7 @@
 <ul class="tsd-hierarchy">
 <li><span class="target">DLDataType</span></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L401">runtime.ts:401</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L401">runtime.ts:401</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -62,7 +62,7 @@
 <h5><span class="tsd-kind-parameter">lanes</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="DLDataType.html" class="tsd-signature-type tsd-kind-class">DLDataType</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L409">runtime.ts:409</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L409">runtime.ts:409</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Properties</h2>
 <section class="tsd-panel tsd-member"><a id="bits" class="tsd-anchor"></a>
@@ -71,21 +71,21 @@
 <div class="tsd-comment tsd-typography"><p>Number of bits in the data type.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L405">runtime.ts:405</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L405">runtime.ts:405</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="code" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>code</span><a href="#code" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">code</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span></div>
 <div class="tsd-comment tsd-typography"><p>The type code</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L403">runtime.ts:403</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L403">runtime.ts:403</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="lanes" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>lanes</span><a href="#lanes" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">lanes</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span></div>
 <div class="tsd-comment tsd-typography"><p>Number of vector lanes.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L407">runtime.ts:407</a></li></ul></aside></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L407">runtime.ts:407</a></li></ul></aside></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="numStorageBytes" class="tsd-anchor"></a>
@@ -95,7 +95,7 @@
 <li class="tsd-description">
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L424">runtime.ts:424</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L424">runtime.ts:424</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="toString" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>to<wbr/>String</span><a href="#toString" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -103,7 +103,7 @@
 <li class="tsd-description">
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">string</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L415">runtime.ts:415</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L415">runtime.ts:415</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/DLDevice.html b/docs/reference/api/typedoc/classes/DLDevice.html
index 90b50fde19..332c1189ab 100644
--- a/docs/reference/api/typedoc/classes/DLDevice.html
+++ b/docs/reference/api/typedoc/classes/DLDevice.html
@@ -23,7 +23,7 @@
 <ul class="tsd-hierarchy">
 <li><span class="target">DLDevice</span></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L341">runtime.ts:341</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L341">runtime.ts:341</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -61,7 +61,7 @@
 <h5><span class="tsd-kind-parameter">lib</span>: <a href="_internal_.FFILibrary.html" class="tsd-signature-type tsd-kind-class">FFILibrary</a></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="DLDevice.html" class="tsd-signature-type tsd-kind-class">DLDevice</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L349">runtime.ts:349</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L349">runtime.ts:349</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Properties</h2>
 <section class="tsd-panel tsd-member"><a id="deviceId" class="tsd-anchor"></a>
@@ -70,14 +70,14 @@
 <div class="tsd-comment tsd-typography"><p>The device index.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L345">runtime.ts:345</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L345">runtime.ts:345</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="deviceType" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>device<wbr/>Type</span><a href="#deviceType" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">device<wbr/>Type</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span></div>
 <div class="tsd-comment tsd-typography"><p>The device type code of the device.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L343">runtime.ts:343</a></li></ul></aside></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L343">runtime.ts:343</a></li></ul></aside></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="sync" class="tsd-anchor"></a>
@@ -89,7 +89,7 @@
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Promise</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">void</span><span class="tsd-signature-symbol">&gt;</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L368">runtime.ts:368</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L368">runtime.ts:368</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="toString" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>to<wbr/>String</span><a href="#toString" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -97,7 +97,7 @@
 <li class="tsd-description">
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">string</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L375">runtime.ts:375</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L375">runtime.ts:375</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/Instance.html b/docs/reference/api/typedoc/classes/Instance.html
index dbfe2723c0..d303bbc880 100644
--- a/docs/reference/api/typedoc/classes/Instance.html
+++ b/docs/reference/api/typedoc/classes/Instance.html
@@ -40,7 +40,7 @@ are not tracked through JS native garbage collection mechanism.</p>
 <ul class="tsd-hierarchy">
 <li><a href="../interfaces/Disposable.html" class="tsd-signature-type tsd-kind-interface">Disposable</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1043">runtime.ts:1043</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1043">runtime.ts:1043</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -140,24 +140,24 @@ a WASI object, or an object containing wasmLibraryProvider field.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="Instance.html" class="tsd-signature-type tsd-kind-class">Instance</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1072">runtime.ts:1072</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1072">runtime.ts:1072</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Properties</h2>
 <section class="tsd-panel tsd-member"><a id="cacheMetadata" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>cache<wbr/>Metadata</span><a href="#cacheMetadata" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">cache<wbr/>Metadata</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type ">Record</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">string</span><span class="tsd-signature-symbol">, </span><span class="tsd-signature-type">any</span><span class="tsd-signature-symbol">&gt;</span><span class="tsd-signature-symbol"> = {}</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1046">runtime.ts:1046</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1046">runtime.ts:1046</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="exports" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>exports</span><a href="#exports" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">exports</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type ">Record</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">string</span><span class="tsd-signature-symbol">, </span><span class="tsd-signature-type ">Function</span><span class="tsd-signature-symbol">&gt;</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1045">runtime.ts:1045</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1045">runtime.ts:1045</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="memory" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>memory</span><a href="#memory" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">memory</span><span class="tsd-signature-symbol">:</span> <a href="_internal_.Memory.html" class="tsd-signature-type tsd-kind-class">Memory</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1044">runtime.ts:1044</a></li></ul></aside></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1044">runtime.ts:1044</a></li></ul></aside></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="applyPresenceAndFrequencyPenalty" class="tsd-anchor"></a>
@@ -193,7 +193,7 @@ token_freqs[i] is the frequency of token_ids[i], for all i. And all token_freqs
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">any</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1772">runtime.ts:1772</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1772">runtime.ts:1772</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="applyRepetitionPenalty" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>apply<wbr/>Repetition<wbr/>Penalty</span><a href="#applyRepetitionPenalty" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -218,7 +218,7 @@ token_freqs[i] is the frequency of token_ids[i], for all i. And all token_freqs
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">any</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1759">runtime.ts:1759</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1759">runtime.ts:1759</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="applySoftmaxWithTemperature" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>apply<wbr/>Softmax<wbr/>With<wbr/>Temperature</span><a href="#applySoftmaxWithTemperature" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -239,7 +239,7 @@ token_freqs[i] is the frequency of token_ids[i], for all i. And all token_freqs
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">any</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1789">runtime.ts:1789</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1789">runtime.ts:1789</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="asyncLoadWebGPUPipelines" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>async<wbr/>Load<wbr/>WebGPUPipelines</span><a href="#asyncLoadWebGPUPipelines" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -256,7 +256,7 @@ token_freqs[i] is the frequency of token_ids[i], for all i. And all token_freqs
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Promise</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">void</span><span class="tsd-signature-symbol">&gt;</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1947">runtime.ts:1947</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1947">runtime.ts:1947</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="attachToCurrentScope" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>attach<wbr/>To<wbr/>Current<wbr/>Scope</span><a href="#attachToCurrentScope" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -283,7 +283,7 @@ token_freqs[i] is the frequency of token_ids[i], for all i. And all token_freqs
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type tsd-kind-type-parameter">T</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1195">runtime.ts:1195</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1195">runtime.ts:1195</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="beginScope" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>begin<wbr/>Scope</span><a href="#beginScope" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -293,7 +293,7 @@ token_freqs[i] is the frequency of token_ids[i], for all i. And all token_freqs
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1154">runtime.ts:1154</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1154">runtime.ts:1154</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="benchmark" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>benchmark</span><a href="#benchmark" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -328,7 +328,7 @@ token_freqs[i] is the frequency of token_ids[i], for all i. And all token_freqs
 <h5><span class="tsd-kind-parameter">repeat</span>: <span class="tsd-signature-type">number</span><span class="tsd-signature-symbol"> = 1</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Promise</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">number</span><span class="tsd-signature-symbol">[]</span><span class="tsd-signature-symbol">&gt;</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1113">runtime.ts:1113</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1113">runtime.ts:1113</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="bindCanvas" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>bind<wbr/>Canvas</span><a href="#bindCanvas" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -345,7 +345,7 @@ token_freqs[i] is the frequency of token_ids[i], for all i. And all token_freqs
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1797">runtime.ts:1797</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1797">runtime.ts:1797</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="clearCanvas" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>clear<wbr/>Canvas</span><a href="#clearCanvas" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -355,7 +355,7 @@ token_freqs[i] is the frequency of token_ids[i], for all i. And all token_freqs
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1828">runtime.ts:1828</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1828">runtime.ts:1828</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="cpu" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>cpu</span><a href="#cpu" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -372,7 +372,7 @@ token_freqs[i] is the frequency of token_ids[i], for all i. And all token_freqs
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="DLDevice.html" class="tsd-signature-type tsd-kind-class">DLDevice</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1657">runtime.ts:1657</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1657">runtime.ts:1657</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="createVirtualMachine" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>create<wbr/>Virtual<wbr/>Machine</span><a href="#createVirtualMachine" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -390,7 +390,7 @@ token_freqs[i] is the frequency of token_ids[i], for all i. And all token_freqs
 <h4 class="tsd-returns-title">Returns <a href="VirtualMachine.html" class="tsd-signature-type tsd-kind-class">VirtualMachine</a></h4><p>The created virtual machime.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1373">runtime.ts:1373</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1373">runtime.ts:1373</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="detachFromCurrentScope" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>detach<wbr/>From<wbr/>Current<wbr/>Scope</span><a href="#detachFromCurrentScope" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -416,7 +416,7 @@ so it won&#39;t be released via auto-release during endscope.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type tsd-kind-type-parameter">T</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1222">runtime.ts:1222</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1222">runtime.ts:1222</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="device" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>device</span><a href="#device" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -438,7 +438,7 @@ so it won&#39;t be released via auto-release during endscope.</p>
 <h4 class="tsd-returns-title">Returns <a href="DLDevice.html" class="tsd-signature-type tsd-kind-class">DLDevice</a></h4><p>The created device.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1649">runtime.ts:1649</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1649">runtime.ts:1649</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="dispose" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>dispose</span><a href="#dispose" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -451,7 +451,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Implementation of <a href="../interfaces/Disposable.html">Disposable</a>.<a href="../interfaces/Disposable.html#dispose">dispose</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1134">runtime.ts:1134</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1134">runtime.ts:1134</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="empty" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>empty</span><a href="#empty" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -477,7 +477,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <a href="NDArray.html" class="tsd-signature-type tsd-kind-class">NDArray</a></h4><p>The created ndarray.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1677">runtime.ts:1677</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1677">runtime.ts:1677</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="endScope" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>end<wbr/>Scope</span><a href="#endScope" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -490,7 +490,7 @@ a value to parent scope.</p>
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1165">runtime.ts:1165</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1165">runtime.ts:1165</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="fetchNDArrayCache" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>fetchNDArray<wbr/>Cache</span><a href="#fetchNDArrayCache" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -516,7 +516,7 @@ a value to parent scope.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Promise</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">any</span><span class="tsd-signature-symbol">&gt;</span></h4><p>The meta data</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1464">runtime.ts:1464</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1464">runtime.ts:1464</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getGlobalFunc" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Global<wbr/>Func</span><a href="#getGlobalFunc" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -534,7 +534,7 @@ a value to parent scope.</p>
 <h4 class="tsd-returns-title">Returns <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></h4><p>The result function.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1310">runtime.ts:1310</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1310">runtime.ts:1310</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getParamsFromCache" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Params<wbr/>From<wbr/>Cache</span><a href="#getParamsFromCache" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -555,7 +555,7 @@ a value to parent scope.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="TVMObject.html" class="tsd-signature-type tsd-kind-class">TVMObject</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1401">runtime.ts:1401</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1401">runtime.ts:1401</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getParamsFromCacheByName" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Params<wbr/>From<wbr/>Cache<wbr/>By<wbr/>Name</span><a href="#getParamsFromCacheByName" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -573,7 +573,7 @@ a value to parent scope.</p>
 <h4 class="tsd-returns-title">Returns <a href="TVMObject.html" class="tsd-signature-type tsd-kind-class">TVMObject</a></h4><p>Parameters read.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1412">runtime.ts:1412</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1412">runtime.ts:1412</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="initWebGPU" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>init<wbr/>WebGPU</span><a href="#initWebGPU" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -590,7 +590,7 @@ a value to parent scope.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L2012">runtime.ts:2012</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L2012">runtime.ts:2012</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="isPackedFunc" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>is<wbr/>Packed<wbr/>Func</span><a href="#isPackedFunc" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -608,7 +608,7 @@ a value to parent scope.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">boolean</span></h4><p>The check result.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1345">runtime.ts:1345</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1345">runtime.ts:1345</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="listGlobalFuncNames" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>list<wbr/>Global<wbr/>Func<wbr/>Names</span><a href="#listGlobalFuncNames" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -619,7 +619,7 @@ a value to parent scope.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">string</span><span class="tsd-signature-symbol">[]</span></h4><p>The name list.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1238">runtime.ts:1238</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1238">runtime.ts:1238</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="makeShapeTuple" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>make<wbr/>Shape<wbr/>Tuple</span><a href="#makeShapeTuple" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -637,7 +637,7 @@ a value to parent scope.</p>
 <h4 class="tsd-returns-title">Returns <a href="TVMObject.html" class="tsd-signature-type tsd-kind-class">TVMObject</a></h4><p>The created shape tuple.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1862">runtime.ts:1862</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1862">runtime.ts:1862</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="makeString" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>make<wbr/>String</span><a href="#makeString" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -655,7 +655,7 @@ a value to parent scope.</p>
 <h4 class="tsd-returns-title">Returns <a href="_internal_.TVMString.html" class="tsd-signature-type tsd-kind-class">TVMString</a></h4><p>The result TVMString.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1853">runtime.ts:1853</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1853">runtime.ts:1853</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="makeTVMArray" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>makeTVMArray</span><a href="#makeTVMArray" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -675,7 +675,7 @@ and needs to be explicitly disposed.</p>
 <h4 class="tsd-returns-title">Returns <a href="TVMArray.html" class="tsd-signature-type tsd-kind-class">TVMArray</a></h4><p>The result array.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1841">runtime.ts:1841</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1841">runtime.ts:1841</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="moveToParentScope" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>move<wbr/>To<wbr/>Parent<wbr/>Scope</span><a href="#moveToParentScope" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -700,7 +700,7 @@ alive when exiting the current scope.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type tsd-kind-type-parameter">T</span></h4><p>The input obj.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1208">runtime.ts:1208</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1208">runtime.ts:1208</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="ndarrayCacheClear" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>ndarray<wbr/>Cache<wbr/>Clear</span><a href="#ndarrayCacheClear" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -710,7 +710,7 @@ alive when exit the current scope.</p>
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1452">runtime.ts:1452</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1452">runtime.ts:1452</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="ndarrayCacheGet" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>ndarray<wbr/>Cache<wbr/>Get</span><a href="#ndarrayCacheGet" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -728,7 +728,7 @@ alive when exit the current scope.</p>
 <h4 class="tsd-returns-title">Returns <a href="NDArray.html" class="tsd-signature-type tsd-kind-class">NDArray</a></h4><p>The result.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1425">runtime.ts:1425</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1425">runtime.ts:1425</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="ndarrayCacheRemove" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>ndarray<wbr/>Cache<wbr/>Remove</span><a href="#ndarrayCacheRemove" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -746,7 +746,7 @@ alive when exit the current scope.</p>
 <h4 class="tsd-returns-title">Returns <a href="NDArray.html" class="tsd-signature-type tsd-kind-class">NDArray</a></h4><p>The result.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1434">runtime.ts:1434</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1434">runtime.ts:1434</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="ndarrayCacheUpdate" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>ndarray<wbr/>Cache<wbr/>Update</span><a href="#ndarrayCacheUpdate" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -769,7 +769,7 @@ alive when exit the current scope.</p>
 <h5><span class="tsd-kind-parameter">override</span>: <span class="tsd-signature-type">boolean</span><span class="tsd-signature-symbol"> = false</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1443">runtime.ts:1443</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1443">runtime.ts:1443</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="registerAsyncServerFunc" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>register<wbr/>Async<wbr/>Server<wbr/>Func</span><a href="#registerAsyncServerFunc" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -796,7 +796,7 @@ alive when exit the current scope.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1920">runtime.ts:1920</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1920">runtime.ts:1920</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="registerFunc" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>register<wbr/>Func</span><a href="#registerFunc" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -819,7 +819,7 @@ alive when exit the current scope.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1277">runtime.ts:1277</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1277">runtime.ts:1277</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="registerInitProgressCallback" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>register<wbr/>Init<wbr/>Progress<wbr/>Callback</span><a href="#registerInitProgressCallback" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -836,7 +836,7 @@ alive when exit the current scope.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1390">runtime.ts:1390</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1390">runtime.ts:1390</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="registerObjectConstructor" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>register<wbr/>Object<wbr/>Constructor</span><a href="#registerObjectConstructor" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -861,7 +861,7 @@ alive when exit the current scope.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1899">runtime.ts:1899</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1899">runtime.ts:1899</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="runtimeStatsText" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>runtime<wbr/>Stats<wbr/>Text</span><a href="#runtimeStatsText" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -871,7 +871,7 @@ alive when exit the current scope.</p>
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">string</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1143">runtime.ts:1143</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1143">runtime.ts:1143</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="sampleTopPFromLogits" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>sample<wbr/>TopPFrom<wbr/>Logits</span><a href="#sampleTopPFromLogits" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -897,7 +897,7 @@ alive when exit the current scope.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><p>The sampled index.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1749">runtime.ts:1749</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1749">runtime.ts:1749</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="scalar" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>scalar</span><a href="#scalar" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -919,7 +919,7 @@ alive when exit the current scope.</p>
 <h4 class="tsd-returns-title">Returns <a href="Scalar.html" class="tsd-signature-type tsd-kind-class">Scalar</a></h4><p>The created scalar.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1639">runtime.ts:1639</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1639">runtime.ts:1639</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="setPackedArguments" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>set<wbr/>Packed<wbr/>Arguments</span><a href="#setPackedArguments" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -949,7 +949,7 @@ Allocate new temporary space from the stack if necessary.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L2145">runtime.ts:2145</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L2145">runtime.ts:2145</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="showImage" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>show<wbr/>Image</span><a href="#showImage" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -966,7 +966,7 @@ Allocate new temporary space from the stack if necessary.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1806">runtime.ts:1806</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1806">runtime.ts:1806</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="systemLib" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>system<wbr/>Lib</span><a href="#systemLib" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -978,7 +978,7 @@ System lib is a global module that contains self-registered functions at startup.<
 <h4 class="tsd-returns-title">Returns <a href="Module.html" class="tsd-signature-type tsd-kind-class">Module</a></h4><p>The system library module.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1231">runtime.ts:1231</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1231">runtime.ts:1231</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="toDLDataType" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>toDLData<wbr/>Type</span><a href="#toDLDataType" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -996,7 +996,7 @@ System lib is a global module that contains self register functions in startup.<
 <h4 class="tsd-returns-title">Returns <a href="DLDataType.html" class="tsd-signature-type tsd-kind-class">DLDataType</a></h4><p>The converted result.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1593">runtime.ts:1593</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1593">runtime.ts:1593</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="toPackedFunc" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>to<wbr/>Packed<wbr/>Func</span><a href="#toPackedFunc" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -1014,7 +1014,7 @@ System lib is a global module that contains self register functions in startup.<
 <h4 class="tsd-returns-title">Returns <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></h4><p>The converted function.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1356">runtime.ts:1356</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1356">runtime.ts:1356</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="typeKey2Index" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>type<wbr/>Key2<wbr/>Index</span><a href="#typeKey2Index" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -1032,7 +1032,7 @@ System lib is a global module that contains self register functions in startup.<
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><p>The corresponding type index.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1871">runtime.ts:1871</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1871">runtime.ts:1871</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="uniform" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>uniform</span><a href="#uniform" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -1062,7 +1062,7 @@ System lib is a global module that contains self register functions in startup.<
 <h4 class="tsd-returns-title">Returns <a href="NDArray.html" class="tsd-signature-type tsd-kind-class">NDArray</a></h4><p>The created ndarray.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1723">runtime.ts:1723</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1723">runtime.ts:1723</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="webgpu" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>webgpu</span><a href="#webgpu" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -1079,7 +1079,7 @@ System lib is a global module that contains self register functions in startup.<
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="DLDevice.html" class="tsd-signature-type tsd-kind-class">DLDevice</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1665">runtime.ts:1665</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1665">runtime.ts:1665</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="withNewScope" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>with<wbr/>New<wbr/>Scope</span><a href="#withNewScope" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -1112,7 +1112,7 @@ System lib is a global module that contains self register functions in startup.<
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type tsd-kind-type-parameter">T</span></h4><p>The result value.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L1179">runtime.ts:1179</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L1179">runtime.ts:1179</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/Module.html b/docs/reference/api/typedoc/classes/Module.html
index 47ac01d106..ac58ad0d51 100644
--- a/docs/reference/api/typedoc/classes/Module.html
+++ b/docs/reference/api/typedoc/classes/Module.html
@@ -27,7 +27,7 @@
 <ul class="tsd-hierarchy">
 <li><a href="../interfaces/Disposable.html" class="tsd-signature-type tsd-kind-interface">Disposable</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L689">runtime.ts:689</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L689">runtime.ts:689</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -73,7 +73,7 @@
 <h4 class="tsd-returns-title">Returns <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></h4></li></ul></li></ul></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="Module.html" class="tsd-signature-type tsd-kind-class">Module</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L694">runtime.ts:694</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L694">runtime.ts:694</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="dispose" class="tsd-anchor"></a>
@@ -88,7 +88,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Implementation of <a href="../interfaces/Disposable.html">Disposable</a>.<a href="../interfaces/Disposable.html#dispose">dispose</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L704">runtime.ts:704</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L704">runtime.ts:704</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getFunction" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Function</span><a href="#getFunction" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -110,7 +110,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></h4><p>The result function.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L732">runtime.ts:732</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L732">runtime.ts:732</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getHandle" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Handle</span><a href="#getHandle" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -128,7 +128,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><p>The handle.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L719">runtime.ts:719</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L719">runtime.ts:719</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="importModule" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>import<wbr/>Module</span><a href="#importModule" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -145,7 +145,7 @@ only the first call will take effect.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L766">runtime.ts:766</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L766">runtime.ts:766</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/NDArray.html b/docs/reference/api/typedoc/classes/NDArray.html
index e88ff8fd0a..051fe0bc11 100644
--- a/docs/reference/api/typedoc/classes/NDArray.html
+++ b/docs/reference/api/typedoc/classes/NDArray.html
@@ -27,7 +27,7 @@
 <ul class="tsd-hierarchy">
 <li><a href="../interfaces/Disposable.html" class="tsd-signature-type tsd-kind-interface">Disposable</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L432">runtime.ts:432</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L432">runtime.ts:432</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -76,7 +76,7 @@
 <h5><span class="tsd-kind-parameter">ctx</span>: <a href="_internal_.RuntimeContext.html" class="tsd-signature-type tsd-kind-class">RuntimeContext</a></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="NDArray.html" class="tsd-signature-type tsd-kind-class">NDArray</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L452">runtime.ts:452</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L452">runtime.ts:452</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Properties</h2>
 <section class="tsd-panel tsd-member"><a id="device" class="tsd-anchor"></a>
@@ -85,35 +85,35 @@
 <div class="tsd-comment tsd-typography"><p>Device of the array.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L442">runtime.ts:442</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L442">runtime.ts:442</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="dtype" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>dtype</span><a href="#dtype" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">dtype</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">string</span></div>
 <div class="tsd-comment tsd-typography"><p>Data type of the array.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L438">runtime.ts:438</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L438">runtime.ts:438</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="isView" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>is<wbr/>View</span><a href="#isView" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">is<wbr/>View</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">boolean</span></div>
 <div class="tsd-comment tsd-typography"><p>Whether it is a temporary view that can become invalid after the call.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L444">runtime.ts:444</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L444">runtime.ts:444</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="ndim" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>ndim</span><a href="#ndim" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">ndim</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span></div>
 <div class="tsd-comment tsd-typography"><p>Number of dimensions.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L436">runtime.ts:436</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L436">runtime.ts:436</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="shape" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>shape</span><a href="#shape" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">shape</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span><span class="tsd-signature-symbol">[]</span></div>
 <div class="tsd-comment tsd-typography"><p>Shape of the array.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L440">runtime.ts:440</a></li></ul></aside></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L440">runtime.ts:440</a></li></ul></aside></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="copyFrom" class="tsd-anchor"></a>
@@ -134,7 +134,7 @@ The number of elements must match.</p>
 <h4 class="tsd-returns-title">Returns <a href="NDArray.html" class="tsd-signature-type tsd-kind-class">NDArray</a></h4><p>this</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L552">runtime.ts:552</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L552">runtime.ts:552</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="copyFromRawBytes" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>copy<wbr/>From<wbr/>Raw<wbr/>Bytes</span><a href="#copyFromRawBytes" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -152,7 +152,7 @@ The number of elements must match.</p>
 <h4 class="tsd-returns-title">Returns <a href="NDArray.html" class="tsd-signature-type tsd-kind-class">NDArray</a></h4><p>this</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L596">runtime.ts:596</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L596">runtime.ts:596</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="dispose" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>dispose</span><a href="#dispose" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -165,7 +165,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Implementation of <a href="../interfaces/Disposable.html">Disposable</a>.<a href="../interfaces/Disposable.html#dispose">dispose</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L537">runtime.ts:537</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L537">runtime.ts:537</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getDataPtr" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Data<wbr/>Ptr</span><a href="#getDataPtr" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -176,7 +176,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><p>The handle.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L530">runtime.ts:530</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L530">runtime.ts:530</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getHandle" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Handle</span><a href="#getHandle" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -194,7 +194,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><p>The handle.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L518">runtime.ts:518</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L518">runtime.ts:518</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="toArray" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>to<wbr/>Array</span><a href="#toArray" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -206,7 +206,7 @@ the dtype of the NDArray.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Uint8Array</span><span class="tsd-signature-symbol"> | </span><span class="tsd-signature-type ">Int32Array</span><span class="tsd-signature-symbol"> | </span><span class="tsd-signature-type ">Float32Array</span><span class="tsd-signature-symbol"> | </span><span class="tsd-signature-type ">Float64Array</span><span class="tsd-signature-symbol"> | </span><span class="tsd-signature-type ">Int8Array</span></h4><p>The resu [...]
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L662">runtime.ts:662</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L662">runtime.ts:662</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="toRawBytes" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>to<wbr/>Raw<wbr/>Bytes</span><a href="#toRawBytes" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -217,7 +217,7 @@ the dtype of the NDArray.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Uint8Array</span></h4><p>The result array.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L631">runtime.ts:631</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L631">runtime.ts:631</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="view" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>view</span><a href="#view" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -235,7 +235,7 @@ the dtype of the NDArray.</p>
 <h4 class="tsd-returns-title">Returns <a href="NDArray.html" class="tsd-signature-type tsd-kind-class">NDArray</a></h4><p>The new sliced ndarray.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L507">runtime.ts:507</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L507">runtime.ts:507</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/RPCServer.html b/docs/reference/api/typedoc/classes/RPCServer.html
index b28737aa00..dcb7dbd90f 100644
--- a/docs/reference/api/typedoc/classes/RPCServer.html
+++ b/docs/reference/api/typedoc/classes/RPCServer.html
@@ -23,7 +23,7 @@
 <ul class="tsd-hierarchy">
 <li><span class="target">RPCServer</span></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L77">rpc_server.ts:77</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L77">rpc_server.ts:77</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -98,7 +98,7 @@
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Promise</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">void</span><span class="tsd-signature-symbol">&gt;</span></h4></li></ul></li></ul></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="RPCServer.html" class="tsd-signature-type tsd-kind-class">RPCServer</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L100">rpc_server.ts:100</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L100">rpc_server.ts:100</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Properties</h2>
 <section class="tsd-panel tsd-member"><a id="getImports" class="tsd-anchor"></a>
@@ -113,12 +113,12 @@
 <li class="tsd-description">
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Record</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">string</span><span class="tsd-signature-symbol">, </span><span class="tsd-signature-type">unknown</span><span class="tsd-signature-symbol">&gt;</span></h4></li></ul></li></ul></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L83">rpc_server.ts:83</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L83">rpc_server.ts:83</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="key" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>key</span><a href="#key" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">key</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">string</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L79">rpc_server.ts:79</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L79">rpc_server.ts:79</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="logger" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>logger</span><a href="#logger" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">logger</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-symbol">(</span><span class="tsd-signature-symbol">(</span><span class="tsd-kind-parameter">msg</span><span class="tsd-signature-symbol">)</span><span class="tsd-signature-symbol"> =&gt; </span><span class="tsd-signature-type">void</span><span class="tsd-signature-symbol">)</span></div>
@@ -136,22 +136,22 @@
 <h5><span class="tsd-kind-parameter">msg</span>: <span class="tsd-signature-type">string</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4></li></ul></li></ul></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L82">rpc_server.ts:82</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L82">rpc_server.ts:82</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="socket" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>socket</span><a href="#socket" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">socket</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type ">WebSocket</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L80">rpc_server.ts:80</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L80">rpc_server.ts:80</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="state" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>state</span><a href="#state" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">state</span><span class="tsd-signature-symbol">:</span> <a href="../enums/_internal_.RPCServerState.html" class="tsd-signature-type tsd-kind-enum">RPCServerState</a><span class="tsd-signature-symbol"> = RPCServerState.InitHeader</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L81">rpc_server.ts:81</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L81">rpc_server.ts:81</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="url" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>url</span><a href="#url" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">url</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">string</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L78">rpc_server.ts:78</a></li></ul></aside></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L78">rpc_server.ts:78</a></li></ul></aside></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/Scalar.html b/docs/reference/api/typedoc/classes/Scalar.html
index bdaf755754..b485c24d3f 100644
--- a/docs/reference/api/typedoc/classes/Scalar.html
+++ b/docs/reference/api/typedoc/classes/Scalar.html
@@ -24,7 +24,7 @@ argument to PackedFunc calls.</p>
 <ul class="tsd-hierarchy">
 <li><span class="target">Scalar</span></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L279">runtime.ts:279</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L279">runtime.ts:279</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -55,7 +55,7 @@ argument to PackedFunc calls.</p>
 <h5><span class="tsd-kind-parameter">dtype</span>: <span class="tsd-signature-type">string</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="Scalar.html" class="tsd-signature-type tsd-kind-class">Scalar</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L285">runtime.ts:285</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L285">runtime.ts:285</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Properties</h2>
 <section class="tsd-panel tsd-member"><a id="dtype" class="tsd-anchor"></a>
@@ -64,14 +64,14 @@ argument to PackedFunc calls.</p>
 <div class="tsd-comment tsd-typography"><p>The data type of the scalar.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L283">runtime.ts:283</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L283">runtime.ts:283</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="value" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>value</span><a href="#value" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">value</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span></div>
 <div class="tsd-comment tsd-typography"><p>The value.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L281">runtime.ts:281</a></li></ul></aside></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L281">runtime.ts:281</a></li></ul></aside></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/TVMArray.html b/docs/reference/api/typedoc/classes/TVMArray.html
index f242bc3ade..22277e639e 100644
--- a/docs/reference/api/typedoc/classes/TVMArray.html
+++ b/docs/reference/api/typedoc/classes/TVMArray.html
@@ -25,7 +25,7 @@
 <ul class="tsd-hierarchy">
 <li><span class="target">TVMArray</span></li></ul></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L863">runtime.ts:863</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L863">runtime.ts:863</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -63,7 +63,7 @@
 <h4 class="tsd-returns-title">Returns <a href="TVMArray.html" class="tsd-signature-type tsd-kind-class">TVMArray</a></h4><aside class="tsd-sources">
 <p>Overrides <a href="TVMObject.html">TVMObject</a>.<a href="TVMObject.html#constructor">constructor</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L864">runtime.ts:864</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L864">runtime.ts:864</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member tsd-is-inherited"><a id="dispose" class="tsd-anchor"></a>
@@ -78,7 +78,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Inherited from <a href="TVMObject.html">TVMObject</a>.<a href="TVMObject.html#dispose">dispose</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L794">runtime.ts:794</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L794">runtime.ts:794</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="get" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get</span><a href="#get" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -96,7 +96,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <a href="../types/_internal_.TVMObjectBase.html" class="tsd-signature-type tsd-kind-type-alias">TVMObjectBase</a></h4><p>The element.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L883">runtime.ts:883</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L883">runtime.ts:883</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member tsd-is-inherited"><a id="getHandle" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Handle</span><a href="#getHandle" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures tsd-is-inherited">
@@ -115,7 +115,7 @@ only the first call will take effect.</p>
 <aside class="tsd-sources">
 <p>Inherited from <a href="TVMObject.html">TVMObject</a>.<a href="TVMObject.html#getHandle">getHandle</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L809">runtime.ts:809</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L809">runtime.ts:809</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="size" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>size</span><a href="#size" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -125,7 +125,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><p>the size of the array.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L875">runtime.ts:875</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L875">runtime.ts:875</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member tsd-is-inherited"><a id="typeIndex" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>type<wbr/>Index</span><a href="#typeIndex" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures tsd-is-inherited">
@@ -136,7 +136,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <p>Inherited from <a href="TVMObject.html">TVMObject</a>.<a href="TVMObject.html#typeIndex">typeIndex</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L817">runtime.ts:817</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L817">runtime.ts:817</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member tsd-is-inherited"><a id="typeKey" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>type<wbr/>Key</span><a href="#typeKey" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures tsd-is-inherited">
@@ -147,7 +147,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">string</span></h4><aside class="tsd-sources">
 <p>Inherited from <a href="TVMObject.html">TVMObject</a>.<a href="TVMObject.html#typeKey">typeKey</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L837">runtime.ts:837</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L837">runtime.ts:837</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/TVMObject.html b/docs/reference/api/typedoc/classes/TVMObject.html
index 0998f6170a..cf47a9356b 100644
--- a/docs/reference/api/typedoc/classes/TVMObject.html
+++ b/docs/reference/api/typedoc/classes/TVMObject.html
@@ -30,7 +30,7 @@
 <ul class="tsd-hierarchy">
 <li><a href="../interfaces/Disposable.html" class="tsd-signature-type tsd-kind-interface">Disposable</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L779">runtime.ts:779</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L779">runtime.ts:779</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -65,7 +65,7 @@
 <h5><span class="tsd-kind-parameter">ctx</span>: <a href="_internal_.RuntimeContext.html" class="tsd-signature-type tsd-kind-class">RuntimeContext</a></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="TVMObject.html" class="tsd-signature-type tsd-kind-class">TVMObject</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L784">runtime.ts:784</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L784">runtime.ts:784</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="dispose" class="tsd-anchor"></a>
@@ -80,7 +80,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Implementation of <a href="../interfaces/Disposable.html">Disposable</a>.<a href="../interfaces/Disposable.html#dispose">dispose</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L794">runtime.ts:794</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L794">runtime.ts:794</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getHandle" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Handle</span><a href="#getHandle" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -98,7 +98,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><p>The handle.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L809">runtime.ts:809</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L809">runtime.ts:809</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="typeIndex" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>type<wbr/>Index</span><a href="#typeIndex" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -108,7 +108,7 @@ only the first call will take effect.</p>
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L817">runtime.ts:817</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L817">runtime.ts:817</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="typeKey" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>type<wbr/>Key</span><a href="#typeKey" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -118,7 +118,7 @@ only the first call will take effect.</p>
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">string</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L837">runtime.ts:837</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L837">runtime.ts:837</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/VirtualMachine.html b/docs/reference/api/typedoc/classes/VirtualMachine.html
index 520f0b71c7..1ad62b391d 100644
--- a/docs/reference/api/typedoc/classes/VirtualMachine.html
+++ b/docs/reference/api/typedoc/classes/VirtualMachine.html
@@ -30,7 +30,7 @@
 <ul class="tsd-hierarchy">
 <li><a href="../interfaces/Disposable.html" class="tsd-signature-type tsd-kind-interface">Disposable</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L918">runtime.ts:918</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L918">runtime.ts:918</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -68,7 +68,7 @@
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="VirtualMachine.html" class="tsd-signature-type tsd-kind-class">VirtualMachine</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L925">runtime.ts:925</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L925">runtime.ts:925</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="dispose" class="tsd-anchor"></a>
@@ -83,7 +83,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Implementation of <a href="../interfaces/Disposable.html">Disposable</a>.<a href="../interfaces/Disposable.html#dispose">dispose</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L938">runtime.ts:938</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L938">runtime.ts:938</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getFunction" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Function</span><a href="#getFunction" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -101,7 +101,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></h4><p>The result function.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L946">runtime.ts:946</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L946">runtime.ts:946</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getInternalModule" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Internal<wbr/>Module</span><a href="#getInternalModule" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -111,7 +111,7 @@ only the first call will take effect.</p>
 </div>
 <h4 class="tsd-returns-title">Returns <a href="Module.html" class="tsd-signature-type tsd-kind-class">Module</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L953">runtime.ts:953</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L953">runtime.ts:953</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/_internal_.CachedCallStack.html b/docs/reference/api/typedoc/classes/_internal_.CachedCallStack.html
index d0f2400c80..c54f2e46bb 100644
--- a/docs/reference/api/typedoc/classes/_internal_.CachedCallStack.html
+++ b/docs/reference/api/typedoc/classes/_internal_.CachedCallStack.html
@@ -36,7 +36,7 @@ can still call into storeXX</li>
 <ul class="tsd-hierarchy">
 <li><a href="../interfaces/Disposable.html" class="tsd-signature-type tsd-kind-interface">Disposable</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L206">memory.ts:206</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L206">memory.ts:206</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -86,7 +86,7 @@ can still call into storeXX</li>
 <h5><span class="tsd-kind-parameter">freeSpace</span>: <a href="../types/_internal_.FTVMWasmFreeSpace.html" class="tsd-signature-type tsd-kind-type-alias">FTVMWasmFreeSpace</a></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="_internal_.CachedCallStack.html" class="tsd-signature-type tsd-kind-class">CachedCallStack</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L225">memory.ts:225</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L225">memory.ts:225</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Properties</h2>
 <section class="tsd-panel tsd-member"><a id="tempArgs" class="tsd-anchor"></a>
@@ -95,7 +95,7 @@ can still call into storeXX</li>
 <div class="tsd-comment tsd-typography"><p>List of temporay arguments that can be disposed during reset.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L208">memory.ts:208</a></li></ul></aside></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L208">memory.ts:208</a></li></ul></aside></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="allocPtrArray" class="tsd-anchor"></a>
@@ -115,7 +115,7 @@ can still call into storeXX</li>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><p>The allocated pointer array.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L312">memory.ts:312</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L312">memory.ts:312</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="allocRawBytes" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>alloc<wbr/>Raw<wbr/>Bytes</span><a href="#allocRawBytes" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -134,7 +134,7 @@ can still call into storeXX</li>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L284">memory.ts:284</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L284">memory.ts:284</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="allocThenSetArgBytes" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>alloc<wbr/>Then<wbr/>Set<wbr/>Arg<wbr/>Bytes</span><a href="#allocThenSetArgBytes" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -156,7 +156,7 @@ Allocate new temporary space for bytes.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L389">memory.ts:389</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L389">memory.ts:389</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="allocThenSetArgString" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>alloc<wbr/>Then<wbr/>Set<wbr/>Arg<wbr/>String</span><a href="#allocThenSetArgString" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -180,7 +180,7 @@ and will be filled when we commit the data.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L377">memory.ts:377</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L377">memory.ts:377</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="commitToWasmMemory" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>commit<wbr/>To<wbr/>Wasm<wbr/>Memory</span><a href="#commitToWasmMemory" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -199,7 +199,7 @@ No further store function should be called.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L267">memory.ts:267</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L267">memory.ts:267</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="dispose" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>dispose</span><a href="#dispose" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -212,7 +212,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Implementation of <a href="../interfaces/Disposable.html">Disposable</a>.<a href="../interfaces/Disposable.html#dispose">dispose</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L243">memory.ts:243</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L243">memory.ts:243</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="ptrFromOffset" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>ptr<wbr/>From<wbr/>Offset</span><a href="#ptrFromOffset" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -230,7 +230,7 @@ Note that the returned value becomes obsolete if alloc is called on the stack.</
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L321">memory.ts:321</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L321">memory.ts:321</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="reset" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>reset</span><a href="#reset" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -240,7 +240,7 @@ Note that the returned value becomes obsolete if alloc is called on the stack.</
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L252">memory.ts:252</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L252">memory.ts:252</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="storeF64" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>store<wbr/>F64</span><a href="#storeF64" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -255,7 +255,7 @@ Note that the returned value becomes obsolete if alloc is called on the stack.</
 <h5><span class="tsd-kind-parameter">value</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L360">memory.ts:360</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L360">memory.ts:360</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="storeI32" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>store<wbr/>I32</span><a href="#storeI32" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -270,7 +270,7 @@ Note that the returned value becomes obsolete if alloc is called on the stack.</
 <h5><span class="tsd-kind-parameter">value</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L342">memory.ts:342</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L342">memory.ts:342</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="storeI64" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>store<wbr/>I64</span><a href="#storeI64" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -285,7 +285,7 @@ Note that the returned value becomes obsolete if alloc is called on the stack.</
 <h5><span class="tsd-kind-parameter">value</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L350">memory.ts:350</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L350">memory.ts:350</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="storePtr" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>store<wbr/>Ptr</span><a href="#storePtr" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -300,7 +300,7 @@ Note that the returned value becomes obsolete if alloc is called on the stack.</
 <h5><span class="tsd-kind-parameter">value</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L326">memory.ts:326</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L326">memory.ts:326</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="storeRawBytes" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>store<wbr/>Raw<wbr/>Bytes</span><a href="#storeRawBytes" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -315,7 +315,7 @@ Note that the returned value becomes obsolete if alloc is called on the stack.</
 <h5><span class="tsd-kind-parameter">bytes</span>: <span class="tsd-signature-type ">Uint8Array</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L364">memory.ts:364</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L364">memory.ts:364</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="storeU32" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>store<wbr/>U32</span><a href="#storeU32" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -330,7 +330,7 @@ Note that the returned value becomes obsolete if alloc is called on the stack.</
 <h5><span class="tsd-kind-parameter">value</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L346">memory.ts:346</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L346">memory.ts:346</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="storeUSize" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>storeUSize</span><a href="#storeUSize" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -345,7 +345,7 @@ Note that the returned value becomes obsolete if alloc is called on the stack.</
 <h5><span class="tsd-kind-parameter">value</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L334">memory.ts:334</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L334">memory.ts:334</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/_internal_.CanvasRenderManager.html b/docs/reference/api/typedoc/classes/_internal_.CanvasRenderManager.html
index 0e22b3c1ec..39bc245214 100644
--- a/docs/reference/api/typedoc/classes/_internal_.CanvasRenderManager.html
+++ b/docs/reference/api/typedoc/classes/_internal_.CanvasRenderManager.html
@@ -29,7 +29,7 @@ which needs to be explicitly disposed.</p>
 <ul class="tsd-hierarchy">
 <li><a href="../interfaces/Disposable.html" class="tsd-signature-type tsd-kind-interface">Disposable</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L168">webgpu.ts:168</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L168">webgpu.ts:168</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -61,7 +61,7 @@ which needs to be explicitly disposed.</p>
 <h5><span class="tsd-kind-parameter">canvas</span>: <span class="tsd-signature-type ">HTMLCanvasElement</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="_internal_.CanvasRenderManager.html" class="tsd-signature-type tsd-kind-class">CanvasRenderManager</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L177">webgpu.ts:177</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L177">webgpu.ts:177</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="clear" class="tsd-anchor"></a>
@@ -71,7 +71,7 @@ which needs to be explicitly disposed.</p>
 <li class="tsd-description">
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L251">webgpu.ts:251</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L251">webgpu.ts:251</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="dispose" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>dispose</span><a href="#dispose" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -84,7 +84,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Implementation of <a href="../interfaces/Disposable.html">Disposable</a>.<a href="../interfaces/Disposable.html#dispose">dispose</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L327">webgpu.ts:327</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L327">webgpu.ts:327</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="draw" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>draw</span><a href="#draw" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -101,7 +101,7 @@ only the first call will take effect.</p>
 <h5><span class="tsd-kind-parameter">width</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L277">webgpu.ts:277</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L277">webgpu.ts:277</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/_internal_.Environment.html b/docs/reference/api/typedoc/classes/_internal_.Environment.html
index 499c694f6f..b1c5b87ed3 100644
--- a/docs/reference/api/typedoc/classes/_internal_.Environment.html
+++ b/docs/reference/api/typedoc/classes/_internal_.Environment.html
@@ -28,7 +28,7 @@
 <ul class="tsd-hierarchy">
 <li><a href="../interfaces/LibraryProvider.html" class="tsd-signature-type tsd-kind-interface">LibraryProvider</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/environment.ts#L68">environment.ts:68</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/environment.ts#L68">environment.ts:68</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -76,7 +76,7 @@
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4></li></ul></li></ul></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="_internal_.Environment.html" class="tsd-signature-type tsd-kind-class">Environment</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/environment.ts#L88">environment.ts:88</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/environment.ts#L88">environment.ts:88</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Properties</h2>
 <section class="tsd-panel tsd-member"><a id="imports" class="tsd-anchor"></a>
@@ -86,7 +86,7 @@
 </div><aside class="tsd-sources">
 <p>Implementation of <a href="../interfaces/LibraryProvider.html">LibraryProvider</a>.<a href="../interfaces/LibraryProvider.html#imports">imports</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/environment.ts#L70">environment.ts:70</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/environment.ts#L70">environment.ts:70</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="logger" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>logger</span><a href="#logger" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">logger</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-symbol">(</span><span class="tsd-signature-symbol">(</span><span class="tsd-kind-parameter">msg</span><span class="tsd-signature-symbol">)</span><span class="tsd-signature-symbol"> =&gt; </span><span class="tsd-signature-type">void</span><span class="tsd-signature-symbol">)</span></div>
@@ -104,7 +104,7 @@
 <h5><span class="tsd-kind-parameter">msg</span>: <span class="tsd-signature-type">string</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4></li></ul></li></ul></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/environment.ts#L69">environment.ts:69</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/environment.ts#L69">environment.ts:69</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="packedCFuncTable" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>packedCFunc<wbr/>Table</span><a href="#packedCFuncTable" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">packedCFunc<wbr/>Table</span><span class="tsd-signature-symbol">:</span> <a href="../types/_internal_.FTVMWasmPackedCFunc.html" class="tsd-signature-type tsd-kind-type-alias">FTVMWasmPackedCFunc</a><span class="tsd-signature-symbol">[]</span><span class="tsd-signature-symbol"> = ...</span></div>
@@ -114,14 +114,14 @@ can call via TVMWasmPackedCFunc.</p>
 of functions that do not map to the address space.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/environment.ts#L78">environment.ts:78</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/environment.ts#L78">environment.ts:78</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="packedCFuncTableFreeId" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>packedCFunc<wbr/>Table<wbr/>Free<wbr/>Id</span><a href="#packedCFuncTableFreeId" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">packedCFunc<wbr/>Table<wbr/>Free<wbr/>Id</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span><span class="tsd-signature-symbol">[]</span><span class="tsd-signature-symbol"> = []</span></div>
 <div class="tsd-comment tsd-typography"><p>Free table index that can be recycled.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/environment.ts#L84">environment.ts:84</a></li></ul></aside></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/environment.ts#L84">environment.ts:84</a></li></ul></aside></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="start" class="tsd-anchor"></a>
@@ -139,7 +139,7 @@ of functions that do not maps to the address space.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Implementation of <a href="../interfaces/LibraryProvider.html">LibraryProvider</a>.<a href="../interfaces/LibraryProvider.html#start">start</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/environment.ts#L105">environment.ts:105</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/environment.ts#L105">environment.ts:105</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/_internal_.FFILibrary.html b/docs/reference/api/typedoc/classes/_internal_.FFILibrary.html
index 42214c80be..149a93477f 100644
--- a/docs/reference/api/typedoc/classes/_internal_.FFILibrary.html
+++ b/docs/reference/api/typedoc/classes/_internal_.FFILibrary.html
@@ -28,7 +28,7 @@
 <ul class="tsd-hierarchy">
 <li><a href="../interfaces/Disposable.html" class="tsd-signature-type tsd-kind-interface">Disposable</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L43">runtime.ts:43</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L43">runtime.ts:43</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -69,29 +69,29 @@
 <h5><span class="tsd-kind-parameter">imports</span>: <span class="tsd-signature-type ">Record</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">string</span><span class="tsd-signature-symbol">, </span><span class="tsd-signature-type">any</span><span class="tsd-signature-symbol">&gt;</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="_internal_.FFILibrary.html" class="tsd-signature-type tsd-kind-class">FFILibrary</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L51">runtime.ts:51</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L51">runtime.ts:51</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Properties</h2>
 <section class="tsd-panel tsd-member"><a id="exports" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>exports</span><a href="#exports" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">exports</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type ">Record</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">string</span><span class="tsd-signature-symbol">, </span><span class="tsd-signature-type ">Function</span><span class="tsd-signature-symbol">&gt;</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L46">runtime.ts:46</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L46">runtime.ts:46</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="memory" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>memory</span><a href="#memory" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">memory</span><span class="tsd-signature-symbol">:</span> <a href="_internal_.Memory.html" class="tsd-signature-type tsd-kind-class">Memory</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L45">runtime.ts:45</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L45">runtime.ts:45</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="wasm32" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>wasm32</span><a href="#wasm32" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">wasm32</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">boolean</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L44">runtime.ts:44</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L44">runtime.ts:44</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="webGPUContext" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><code class="tsd-tag ts-flagOptional">Optional</code> <span>webGPUContext</span><a href="#webGPUContext" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">webGPUContext</span><span class="tsd-signature-symbol">?:</span> <a href="_internal_.WebGPUContext.html" class="tsd-signature-type tsd-kind-class">WebGPUContext</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L47">runtime.ts:47</a></li></ul></aside></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L47">runtime.ts:47</a></li></ul></aside></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="checkCall" class="tsd-anchor"></a>
@@ -106,7 +106,7 @@
 <h5><span class="tsd-kind-parameter">code</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L77">runtime.ts:77</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L77">runtime.ts:77</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="dispose" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>dispose</span><a href="#dispose" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -119,7 +119,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Implementation of <a href="../interfaces/Disposable.html">Disposable</a>.<a href="../interfaces/Disposable.html#dispose">dispose</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L66">runtime.ts:66</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L66">runtime.ts:66</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getOrAllocCallStack" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Or<wbr/>Alloc<wbr/>Call<wbr/>Stack</span><a href="#getOrAllocCallStack" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -127,7 +127,7 @@ only the first call will take effect.</p>
 <li class="tsd-description">
 <h4 class="tsd-returns-title">Returns <a href="_internal_.CachedCallStack.html" class="tsd-signature-type tsd-kind-class">CachedCallStack</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L86">runtime.ts:86</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L86">runtime.ts:86</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="recycleCallStack" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>recycle<wbr/>Call<wbr/>Stack</span><a href="#recycleCallStack" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -140,7 +140,7 @@ only the first call will take effect.</p>
 <h5><span class="tsd-kind-parameter">callstack</span>: <a href="_internal_.CachedCallStack.html" class="tsd-signature-type tsd-kind-class">CachedCallStack</a></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L97">runtime.ts:97</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L97">runtime.ts:97</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="sizeofPtr" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>sizeof<wbr/>Ptr</span><a href="#sizeofPtr" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -148,7 +148,7 @@ only the first call will take effect.</p>
 <li class="tsd-description">
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L73">runtime.ts:73</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L73">runtime.ts:73</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/_internal_.Memory.html b/docs/reference/api/typedoc/classes/_internal_.Memory.html
index 6813865753..18b8ad0b4c 100644
--- a/docs/reference/api/typedoc/classes/_internal_.Memory.html
+++ b/docs/reference/api/typedoc/classes/_internal_.Memory.html
@@ -24,7 +24,7 @@
 <ul class="tsd-hierarchy">
 <li><span class="target">Memory</span></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L31">memory.ts:31</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L31">memory.ts:31</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -70,19 +70,19 @@
 <h5><span class="tsd-kind-parameter">memory</span>: <span class="tsd-signature-type ">Memory</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="_internal_.Memory.html" class="tsd-signature-type tsd-kind-class">Memory</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L42">memory.ts:42</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L42">memory.ts:42</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Properties</h2>
 <section class="tsd-panel tsd-member"><a id="memory" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>memory</span><a href="#memory" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">memory</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type ">Memory</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L32">memory.ts:32</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L32">memory.ts:32</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="wasm32" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>wasm32</span><a href="#wasm32" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">wasm32</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">boolean</span><span class="tsd-signature-symbol"> = true</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L33">memory.ts:33</a></li></ul></aside></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L33">memory.ts:33</a></li></ul></aside></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="loadCString" class="tsd-anchor"></a>
@@ -101,7 +101,7 @@
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">string</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L154">memory.ts:154</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L154">memory.ts:154</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="loadF32" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>load<wbr/>F32</span><a href="#loadF32" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -114,7 +114,7 @@
 <h5><span class="tsd-kind-parameter">ptr</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L90">memory.ts:90</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L90">memory.ts:90</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="loadF64" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>load<wbr/>F64</span><a href="#loadF64" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -127,7 +127,7 @@
 <h5><span class="tsd-kind-parameter">ptr</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L97">memory.ts:97</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L97">memory.ts:97</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="loadI32" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>load<wbr/>I32</span><a href="#loadI32" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -140,7 +140,7 @@
 <h5><span class="tsd-kind-parameter">ptr</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L74">memory.ts:74</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L74">memory.ts:74</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="loadI64" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>load<wbr/>I64</span><a href="#loadI64" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -153,7 +153,7 @@
 <h5><span class="tsd-kind-parameter">ptr</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L81">memory.ts:81</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L81">memory.ts:81</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="loadPointer" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>load<wbr/>Pointer</span><a href="#loadPointer" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -166,7 +166,7 @@
 <h5><span class="tsd-kind-parameter">ptr</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L104">memory.ts:104</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L104">memory.ts:104</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="loadRawBytes" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>load<wbr/>Raw<wbr/>Bytes</span><a href="#loadRawBytes" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -187,7 +187,7 @@
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Uint8Array</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L132">memory.ts:132</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L132">memory.ts:132</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="loadTVMBytes" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>loadTVMBytes</span><a href="#loadTVMBytes" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -204,7 +204,7 @@
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Uint8Array</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L145">memory.ts:145</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L145">memory.ts:145</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="loadU16" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>load<wbr/>U16</span><a href="#loadU16" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -217,7 +217,7 @@
 <h5><span class="tsd-kind-parameter">ptr</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L60">memory.ts:60</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L60">memory.ts:60</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="loadU32" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>load<wbr/>U32</span><a href="#loadU32" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -230,7 +230,7 @@
 <h5><span class="tsd-kind-parameter">ptr</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L67">memory.ts:67</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L67">memory.ts:67</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="loadU8" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>load<wbr/>U8</span><a href="#loadU8" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -243,7 +243,7 @@
 <h5><span class="tsd-kind-parameter">ptr</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L53">memory.ts:53</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L53">memory.ts:53</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="loadUSize" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>loadUSize</span><a href="#loadUSize" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -256,7 +256,7 @@
 <h5><span class="tsd-kind-parameter">ptr</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L114">memory.ts:114</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L114">memory.ts:114</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="sizeofPtr" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>sizeof<wbr/>Ptr</span><a href="#sizeofPtr" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -264,7 +264,7 @@
 <li class="tsd-description">
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L124">memory.ts:124</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L124">memory.ts:124</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="storeRawBytes" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>store<wbr/>Raw<wbr/>Bytes</span><a href="#storeRawBytes" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -285,7 +285,7 @@
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/memory.ts#L175">memory.ts:175</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/memory.ts#L175">memory.ts:175</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/_internal_.PackedFuncCell.html b/docs/reference/api/typedoc/classes/_internal_.PackedFuncCell.html
index 6f0aa46ec0..a52f7d9a6f 100644
--- a/docs/reference/api/typedoc/classes/_internal_.PackedFuncCell.html
+++ b/docs/reference/api/typedoc/classes/_internal_.PackedFuncCell.html
@@ -28,7 +28,7 @@
 <ul class="tsd-hierarchy">
 <li><a href="../interfaces/Disposable.html" class="tsd-signature-type tsd-kind-interface">Disposable</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L294">runtime.ts:294</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L294">runtime.ts:294</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -59,7 +59,7 @@
 <h5><span class="tsd-kind-parameter">lib</span>: <a href="_internal_.FFILibrary.html" class="tsd-signature-type tsd-kind-class">FFILibrary</a></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="_internal_.PackedFuncCell.html" class="tsd-signature-type tsd-kind-class">PackedFuncCell</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L298">runtime.ts:298</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L298">runtime.ts:298</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="dispose" class="tsd-anchor"></a>
@@ -74,7 +74,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Implementation of <a href="../interfaces/Disposable.html">Disposable</a>.<a href="../interfaces/Disposable.html#dispose">dispose</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L303">runtime.ts:303</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L303">runtime.ts:303</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getHandle" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Handle</span><a href="#getHandle" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -87,7 +87,7 @@ only the first call will take effect.</p>
 <h5><span class="tsd-kind-parameter">requireNotNull</span>: <span class="tsd-signature-type">boolean</span><span class="tsd-signature-symbol"> = true</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L312">runtime.ts:312</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L312">runtime.ts:312</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/_internal_.RuntimeContext.html b/docs/reference/api/typedoc/classes/_internal_.RuntimeContext.html
index 7c2587c9a5..4f9b403811 100644
--- a/docs/reference/api/typedoc/classes/_internal_.RuntimeContext.html
+++ b/docs/reference/api/typedoc/classes/_internal_.RuntimeContext.html
@@ -28,7 +28,7 @@
 <ul class="tsd-hierarchy">
 <li><a href="../interfaces/Disposable.html" class="tsd-signature-type tsd-kind-interface">Disposable</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L143">runtime.ts:143</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L143">runtime.ts:143</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -94,104 +94,104 @@
 <h4 class="tsd-returns-title">Returns <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></h4></li></ul></li></ul></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="_internal_.RuntimeContext.html" class="tsd-signature-type tsd-kind-class">RuntimeContext</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L166">runtime.ts:166</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L166">runtime.ts:166</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Properties</h2>
 <section class="tsd-panel tsd-member"><a id="applyPresenceAndFrequencyPenalty" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>apply<wbr/>Presence<wbr/>And<wbr/>Frequency<wbr/>Penalty</span><a href="#applyPresenceAndFrequencyPenalty" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">apply<wbr/>Presence<wbr/>And<wbr/>Frequency<wbr/>Penalty</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L161">runtime.ts:161</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L161">runtime.ts:161</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="applyRepetitionPenalty" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>apply<wbr/>Repetition<wbr/>Penalty</span><a href="#applyRepetitionPenalty" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">apply<wbr/>Repetition<wbr/>Penalty</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L160">runtime.ts:160</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L160">runtime.ts:160</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="applySoftmaxWithTemperature" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>apply<wbr/>Softmax<wbr/>With<wbr/>Temperature</span><a href="#applySoftmaxWithTemperature" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">apply<wbr/>Softmax<wbr/>With<wbr/>Temperature</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L162">runtime.ts:162</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L162">runtime.ts:162</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="arrayCacheClear" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>array<wbr/>Cache<wbr/>Clear</span><a href="#arrayCacheClear" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">array<wbr/>Cache<wbr/>Clear</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L153">runtime.ts:153</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L153">runtime.ts:153</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="arrayCacheGet" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>array<wbr/>Cache<wbr/>Get</span><a href="#arrayCacheGet" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">array<wbr/>Cache<wbr/>Get</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L150">runtime.ts:150</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L150">runtime.ts:150</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="arrayCacheRemove" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>array<wbr/>Cache<wbr/>Remove</span><a href="#arrayCacheRemove" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">array<wbr/>Cache<wbr/>Remove</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L152">runtime.ts:152</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L152">runtime.ts:152</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="arrayCacheUpdate" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>array<wbr/>Cache<wbr/>Update</span><a href="#arrayCacheUpdate" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">array<wbr/>Cache<wbr/>Update</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L151">runtime.ts:151</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L151">runtime.ts:151</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="arrayDecodeStorage" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>array<wbr/>Decode<wbr/>Storage</span><a href="#arrayDecodeStorage" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">array<wbr/>Decode<wbr/>Storage</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L154">runtime.ts:154</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L154">runtime.ts:154</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="arrayGetItem" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>array<wbr/>Get<wbr/>Item</span><a href="#arrayGetItem" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">array<wbr/>Get<wbr/>Item</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L144">runtime.ts:144</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L144">runtime.ts:144</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="arrayGetSize" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>array<wbr/>Get<wbr/>Size</span><a href="#arrayGetSize" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">array<wbr/>Get<wbr/>Size</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L145">runtime.ts:145</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L145">runtime.ts:145</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="arrayMake" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>array<wbr/>Make</span><a href="#arrayMake" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">array<wbr/>Make</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L146">runtime.ts:146</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L146">runtime.ts:146</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="getFFIString" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>getFFIString</span><a href="#getFFIString" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">getFFIString</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L148">runtime.ts:148</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L148">runtime.ts:148</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="getSysLib" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Sys<wbr/>Lib</span><a href="#getSysLib" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">get<wbr/>Sys<wbr/>Lib</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L149">runtime.ts:149</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L149">runtime.ts:149</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="makeShapeTuple" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>make<wbr/>Shape<wbr/>Tuple</span><a href="#makeShapeTuple" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">make<wbr/>Shape<wbr/>Tuple</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L157">runtime.ts:157</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L157">runtime.ts:157</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="ndarrayCreateView" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>ndarray<wbr/>Create<wbr/>View</span><a href="#ndarrayCreateView" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">ndarray<wbr/>Create<wbr/>View</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L158">runtime.ts:158</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L158">runtime.ts:158</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="paramModuleFromCache" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>param<wbr/>Module<wbr/>From<wbr/>Cache</span><a href="#paramModuleFromCache" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">param<wbr/>Module<wbr/>From<wbr/>Cache</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L155">runtime.ts:155</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L155">runtime.ts:155</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="paramModuleFromCacheByName" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>param<wbr/>Module<wbr/>From<wbr/>Cache<wbr/>By<wbr/>Name</span><a href="#paramModuleFromCacheByName" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">param<wbr/>Module<wbr/>From<wbr/>Cache<wbr/>By<wbr/>Name</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L156">runtime.ts:156</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L156">runtime.ts:156</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="sampleTopPFromLogits" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>sample<wbr/>TopPFrom<wbr/>Logits</span><a href="#sampleTopPFromLogits" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">sample<wbr/>TopPFrom<wbr/>Logits</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L159">runtime.ts:159</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L159">runtime.ts:159</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="stringMake" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>string<wbr/>Make</span><a href="#stringMake" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">string<wbr/>Make</span><span class="tsd-signature-symbol">:</span> <a href="../types/PackedFunc.html" class="tsd-signature-type tsd-kind-type-alias">PackedFunc</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L147">runtime.ts:147</a></li></ul></aside></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L147">runtime.ts:147</a></li></ul></aside></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="attachToCurrentScope" class="tsd-anchor"></a>
@@ -219,7 +219,7 @@
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type tsd-kind-type-parameter">T</span></h4><p>the same object.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L237">runtime.ts:237</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L237">runtime.ts:237</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="beginScope" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>begin<wbr/>Scope</span><a href="#beginScope" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -227,7 +227,7 @@
 <li class="tsd-description">
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L211">runtime.ts:211</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L211">runtime.ts:211</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="detachFromCurrentScope" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>detach<wbr/>From<wbr/>Current<wbr/>Scope</span><a href="#detachFromCurrentScope" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -245,7 +245,7 @@
 <h5><span class="tsd-kind-parameter">obj</span>: <span class="tsd-signature-type tsd-kind-type-parameter">T</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type tsd-kind-type-parameter">T</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L256">runtime.ts:256</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L256">runtime.ts:256</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="dispose" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>dispose</span><a href="#dispose" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -258,7 +258,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Implementation of <a href="../interfaces/Disposable.html">Disposable</a>.<a href="../interfaces/Disposable.html#dispose">dispose</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L188">runtime.ts:188</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L188">runtime.ts:188</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="endScope" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>end<wbr/>Scope</span><a href="#endScope" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -266,7 +266,7 @@ only the first call will take effect.</p>
 <li class="tsd-description">
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L215">runtime.ts:215</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L215">runtime.ts:215</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="moveToParentScope" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>move<wbr/>To<wbr/>Parent<wbr/>Scope</span><a href="#moveToParentScope" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -284,7 +284,7 @@ only the first call will take effect.</p>
 <h5><span class="tsd-kind-parameter">obj</span>: <span class="tsd-signature-type tsd-kind-type-parameter">T</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type tsd-kind-type-parameter">T</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L246">runtime.ts:246</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L246">runtime.ts:246</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/_internal_.TVMString.html b/docs/reference/api/typedoc/classes/_internal_.TVMString.html
index a3046cf778..ed1624cf66 100644
--- a/docs/reference/api/typedoc/classes/_internal_.TVMString.html
+++ b/docs/reference/api/typedoc/classes/_internal_.TVMString.html
@@ -26,7 +26,7 @@
 <ul class="tsd-hierarchy">
 <li><span class="target">TVMString</span></li></ul></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L889">runtime.ts:889</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L889">runtime.ts:889</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -63,7 +63,7 @@
 <h4 class="tsd-returns-title">Returns <a href="_internal_.TVMString.html" class="tsd-signature-type tsd-kind-class">TVMString</a></h4><aside class="tsd-sources">
 <p>Overrides <a href="TVMObject.html">TVMObject</a>.<a href="TVMObject.html#constructor">constructor</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L890">runtime.ts:890</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L890">runtime.ts:890</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member tsd-is-inherited"><a id="dispose" class="tsd-anchor"></a>
@@ -78,7 +78,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <p>Inherited from <a href="TVMObject.html">TVMObject</a>.<a href="TVMObject.html#dispose">dispose</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L794">runtime.ts:794</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L794">runtime.ts:794</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member tsd-is-inherited"><a id="getHandle" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>Handle</span><a href="#getHandle" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures tsd-is-inherited">
@@ -97,7 +97,7 @@ only the first call will take effect.</p>
 <aside class="tsd-sources">
 <p>Inherited from <a href="TVMObject.html">TVMObject</a>.<a href="TVMObject.html#getHandle">getHandle</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L809">runtime.ts:809</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L809">runtime.ts:809</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="toString" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>to<wbr/>String</span><a href="#toString" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -107,7 +107,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">string</span></h4><p>the size of the array.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L901">runtime.ts:901</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L901">runtime.ts:901</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member tsd-is-inherited"><a id="typeIndex" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>type<wbr/>Index</span><a href="#typeIndex" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures tsd-is-inherited">
@@ -118,7 +118,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4><aside class="tsd-sources">
 <p>Inherited from <a href="TVMObject.html">TVMObject</a>.<a href="TVMObject.html#typeIndex">typeIndex</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L817">runtime.ts:817</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L817">runtime.ts:817</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member tsd-is-inherited"><a id="typeKey" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>type<wbr/>Key</span><a href="#typeKey" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures tsd-is-inherited">
@@ -129,7 +129,7 @@ only the first call will take effect.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">string</span></h4><aside class="tsd-sources">
 <p>Inherited from <a href="TVMObject.html">TVMObject</a>.<a href="TVMObject.html#typeKey">typeKey</a></p>
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L837">runtime.ts:837</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L837">runtime.ts:837</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/classes/_internal_.WebGPUContext.html b/docs/reference/api/typedoc/classes/_internal_.WebGPUContext.html
index 29bb12ff3d..adbd25c264 100644
--- a/docs/reference/api/typedoc/classes/_internal_.WebGPUContext.html
+++ b/docs/reference/api/typedoc/classes/_internal_.WebGPUContext.html
@@ -25,7 +25,7 @@ Manages all the webgpu resources here.</p>
 <ul class="tsd-hierarchy">
 <li><span class="target">WebGPUContext</span></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L345">webgpu.ts:345</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L345">webgpu.ts:345</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -69,19 +69,19 @@ Manages all the webgpu resources here.</p>
 <h5><span class="tsd-kind-parameter">device</span>: <span class="tsd-signature-type ">GPUDevice</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="_internal_.WebGPUContext.html" class="tsd-signature-type tsd-kind-class">WebGPUContext</a></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L370">webgpu.ts:370</a></li></ul></aside></li></ul></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L370">webgpu.ts:370</a></li></ul></aside></li></ul></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Properties</h2>
 <section class="tsd-panel tsd-member"><a id="device" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>device</span><a href="#device" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">device</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type ">GPUDevice</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L346">webgpu.ts:346</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L346">webgpu.ts:346</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="memory" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>memory</span><a href="#memory" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">memory</span><span class="tsd-signature-symbol">:</span> <a href="_internal_.Memory.html" class="tsd-signature-type tsd-kind-class">Memory</a></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L347">webgpu.ts:347</a></li></ul></aside></section></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L347">webgpu.ts:347</a></li></ul></aside></section></section>
 <section class="tsd-panel-group tsd-member-group">
 <h2>Methods</h2>
 <section class="tsd-panel tsd-member"><a id="bindCanvas" class="tsd-anchor"></a>
@@ -100,7 +100,7 @@ Manages all the webgpu resources here.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L454">webgpu.ts:454</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L454">webgpu.ts:454</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="clearCanvas" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>clear<wbr/>Canvas</span><a href="#clearCanvas" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -110,7 +110,7 @@ Manages all the webgpu resources here.</p>
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L446">webgpu.ts:446</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L446">webgpu.ts:446</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="copyRawBytesToBuffer" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>copy<wbr/>Raw<wbr/>Bytes<wbr/>To<wbr/>Buffer</span><a href="#copyRawBytesToBuffer" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -139,7 +139,7 @@ Manages all the webgpu resources here.</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L428">webgpu.ts:428</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L428">webgpu.ts:428</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="createShader" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>create<wbr/>Shader</span><a href="#createShader" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -160,7 +160,7 @@ via createComputePipeline</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Function</span></h4><p>The shader</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L466">webgpu.ts:466</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L466">webgpu.ts:466</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="createShaderAsync" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>create<wbr/>Shader<wbr/>Async</span><a href="#createShaderAsync" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -181,7 +181,7 @@ via createComputePipelineAsync</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Promise</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type ">Function</span><span class="tsd-signature-symbol">&gt;</span></h4><p>The shader</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L478">webgpu.ts:478</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L478">webgpu.ts:478</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="dispose" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>dispose</span><a href="#dispose" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -191,7 +191,7 @@ via createComputePipelineAsync</p>
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L378">webgpu.ts:378</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L378">webgpu.ts:378</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="drawImageFromBuffer" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>draw<wbr/>Image<wbr/>From<wbr/>Buffer</span><a href="#drawImageFromBuffer" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -216,7 +216,7 @@ via createComputePipelineAsync</p>
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L413">webgpu.ts:413</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L413">webgpu.ts:413</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="getDeviceAPI" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>get<wbr/>DeviceAPI</span><a href="#getDeviceAPI" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -232,7 +232,7 @@ via createComputePipelineAsync</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Function</span></h4><p>The corresponding device api.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L734">webgpu.ts:734</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L734">webgpu.ts:734</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="runtimeStatsText" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>runtime<wbr/>Stats<wbr/>Text</span><a href="#runtimeStatsText" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -242,7 +242,7 @@ via createComputePipelineAsync</p>
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">string</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L400">webgpu.ts:400</a></li></ul></aside></li></ul></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L400">webgpu.ts:400</a></li></ul></aside></li></ul></section>
 <section class="tsd-panel tsd-member"><a id="sync" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>sync</span><a href="#sync" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <ul class="tsd-signatures">
@@ -252,7 +252,7 @@ via createComputePipelineAsync</p>
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Promise</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">void</span><span class="tsd-signature-symbol">&gt;</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L393">webgpu.ts:393</a></li></ul></aside></li></ul></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L393">webgpu.ts:393</a></li></ul></aside></li></ul></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/enums/_internal_.RPCServerState.html b/docs/reference/api/typedoc/enums/_internal_.RPCServerState.html
index da79b7b179..039ad9517f 100644
--- a/docs/reference/api/typedoc/enums/_internal_.RPCServerState.html
+++ b/docs/reference/api/typedoc/enums/_internal_.RPCServerState.html
@@ -17,7 +17,7 @@
 <li><a href="_internal_.RPCServerState.html">RPCServerState</a></li></ul>
 <h1>Enumeration RPCServerState</h1></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L27">rpc_server.ts:27</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L27">rpc_server.ts:27</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -38,32 +38,32 @@
 <h3 class="tsd-anchor-link"><span>Init<wbr/>Header</span><a href="#InitHeader" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><path stroke="none" d="M0 0h24v24H0z" fill="none" id="icon-anchor-a"></path><path d="M10 14a3.5 3.5 0 0 0 5 0l4 -4a3.5 3.5 0 0 0 -5 -5l-.5 .5" id="icon-anchor-b"></path><path d="M14 10a3.5 3.5 0 0 0 -5 [...]
 <div class="tsd-signature"><span class="tsd-kind-enum-member">Init<wbr/>Header</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">0</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L28">rpc_server.ts:28</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L28">rpc_server.ts:28</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="InitHeaderKey" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>Init<wbr/>Header<wbr/>Key</span><a href="#InitHeaderKey" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-enum-member">Init<wbr/>Header<wbr/>Key</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">1</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L29">rpc_server.ts:29</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L29">rpc_server.ts:29</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="InitServer" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>Init<wbr/>Server</span><a href="#InitServer" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-enum-member">Init<wbr/>Server</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">2</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L30">rpc_server.ts:30</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L30">rpc_server.ts:30</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="ReceivePacketBody" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>Receive<wbr/>Packet<wbr/>Body</span><a href="#ReceivePacketBody" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-enum-member">Receive<wbr/>Packet<wbr/>Body</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">5</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L33">rpc_server.ts:33</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L33">rpc_server.ts:33</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="ReceivePacketHeader" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>Receive<wbr/>Packet<wbr/>Header</span><a href="#ReceivePacketHeader" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-enum-member">Receive<wbr/>Packet<wbr/>Header</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">4</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L32">rpc_server.ts:32</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L32">rpc_server.ts:32</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="WaitForCallback" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>Wait<wbr/>For<wbr/>Callback</span><a href="#WaitForCallback" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-enum-member">Wait<wbr/>For<wbr/>Callback</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">3</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/rpc_server.ts#L31">rpc_server.ts:31</a></li></ul></aside></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/rpc_server.ts#L31">rpc_server.ts:31</a></li></ul></aside></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/functions/assert.html b/docs/reference/api/typedoc/functions/assert.html
index ba60dd8483..1a719a1aca 100644
--- a/docs/reference/api/typedoc/functions/assert.html
+++ b/docs/reference/api/typedoc/functions/assert.html
@@ -34,7 +34,7 @@
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-symbol">asserts </span><span class="tsd-kind-parameter">condition</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/support.ts#L52">support.ts:52</a></li></ul></aside></li></ul></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/support.ts#L52">support.ts:52</a></li></ul></aside></li></ul></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/functions/createPolyfillWASI.html b/docs/reference/api/typedoc/functions/createPolyfillWASI.html
index 3388b6926a..6798142238 100644
--- a/docs/reference/api/typedoc/functions/createPolyfillWASI.html
+++ b/docs/reference/api/typedoc/functions/createPolyfillWASI.html
@@ -24,7 +24,7 @@
 <h4 class="tsd-returns-title">Returns <a href="../interfaces/LibraryProvider.html" class="tsd-signature-type tsd-kind-interface">LibraryProvider</a></h4><p>A wasi that can run on broswer or local.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/compact.ts#L55">compact.ts:55</a></li></ul></aside></li></ul></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/compact.ts#L55">compact.ts:55</a></li></ul></aside></li></ul></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/functions/detectGPUDevice.html b/docs/reference/api/typedoc/functions/detectGPUDevice.html
index 4b165346b1..48baab6d86 100644
--- a/docs/reference/api/typedoc/functions/detectGPUDevice.html
+++ b/docs/reference/api/typedoc/functions/detectGPUDevice.html
@@ -23,7 +23,7 @@
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Promise</span><span class="tsd-signature-symbol">&lt;</span><a href="../interfaces/GPUDeviceDetectOutput.html" class="tsd-signature-type tsd-kind-interface">GPUDeviceDetectOutput</a><span class="tsd-signature-symbol"> | </span><span class="tsd-signature-type">undefined</span><span class="tsd-signature-symbol">&gt;</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L36">webgpu.ts:36</a></li></ul></aside></li></ul></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L36">webgpu.ts:36</a></li></ul></aside></li></ul></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/functions/hasNDArrayInCache.html b/docs/reference/api/typedoc/functions/hasNDArrayInCache.html
index 341c12a2c0..d63bb39863 100644
--- a/docs/reference/api/typedoc/functions/hasNDArrayInCache.html
+++ b/docs/reference/api/typedoc/functions/hasNDArrayInCache.html
@@ -28,7 +28,7 @@
 <h5><span class="tsd-kind-parameter">cacheScope</span>: <span class="tsd-signature-type">string</span><span class="tsd-signature-symbol"> = &quot;tvmjs&quot;</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Promise</span><span class="tsd-signature-symbol">&lt;</span><span class="tsd-signature-type">boolean</span><span class="tsd-signature-symbol">&gt;</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L2441">runtime.ts:2441</a></li></ul></aside></li></ul></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L2441">runtime.ts:2441</a></li></ul></aside></li></ul></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/functions/instantiate.html b/docs/reference/api/typedoc/functions/instantiate.html
index 3763c26d3c..d37f82c2e0 100644
--- a/docs/reference/api/typedoc/functions/instantiate.html
+++ b/docs/reference/api/typedoc/functions/instantiate.html
@@ -53,7 +53,7 @@ by passing its generated js Module as the imports.</p>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4></li></ul></li></ul></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type ">Promise</span><span class="tsd-signature-symbol">&lt;</span><a href="../classes/Instance.html" class="tsd-signature-type tsd-kind-class">Instance</a><span class="tsd-signature-symbol">&gt;</span></h4><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L2427">runtime.ts:2427</a></li></ul></aside></li></ul></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L2427">runtime.ts:2427</a></li></ul></aside></li></ul></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/functions/wasmPath.html b/docs/reference/api/typedoc/functions/wasmPath.html
index 0189b67b35..c145f5e55b 100644
--- a/docs/reference/api/typedoc/functions/wasmPath.html
+++ b/docs/reference/api/typedoc/functions/wasmPath.html
@@ -24,7 +24,7 @@
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">string</span></h4><p>The wasm path.</p>
 <aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/support.ts#L62">support.ts:62</a></li></ul></aside></li></ul></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/support.ts#L62">support.ts:62</a></li></ul></aside></li></ul></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/interfaces/Disposable.html b/docs/reference/api/typedoc/interfaces/Disposable.html
index 84cfae49d9..3ee747ece2 100644
--- a/docs/reference/api/typedoc/interfaces/Disposable.html
+++ b/docs/reference/api/typedoc/interfaces/Disposable.html
@@ -37,7 +37,7 @@ which needs to be explicitly disposed.</p>
 <li><a href="../classes/TVMObject.html" class="tsd-signature-type tsd-kind-class">TVMObject</a></li>
 <li><a href="../classes/VirtualMachine.html" class="tsd-signature-type tsd-kind-class">VirtualMachine</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/types.ts#L46">types.ts:46</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/types.ts#L46">types.ts:46</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -65,7 +65,7 @@ only the first call will take effect.</p>
 </div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4></li></ul></li></ul></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/types.ts#L52">types.ts:52</a></li></ul></aside></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/types.ts#L52">types.ts:52</a></li></ul></aside></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/interfaces/GPUDeviceDetectOutput.html b/docs/reference/api/typedoc/interfaces/GPUDeviceDetectOutput.html
index 04cc8c89a9..169a15105a 100644
--- a/docs/reference/api/typedoc/interfaces/GPUDeviceDetectOutput.html
+++ b/docs/reference/api/typedoc/interfaces/GPUDeviceDetectOutput.html
@@ -20,7 +20,7 @@
 <ul class="tsd-hierarchy">
 <li><span class="target">GPUDeviceDetectOutput</span></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L27">webgpu.ts:27</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L27">webgpu.ts:27</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -38,17 +38,17 @@
 <h3 class="tsd-anchor-link"><span>adapter</span><a href="#adapter" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><path stroke="none" d="M0 0h24v24H0z" fill="none" id="icon-anchor-a"></path><path d="M10 14a3.5 3.5 0 0 0 5 0l4 -4a3.5 3.5 0 0 0 -5 -5l-.5 .5" id="icon-anchor-b"></path><path d="M14 10a3.5 3.5 0 0 0 -5 0l-4 4a3.5  [...]
 <div class="tsd-signature"><span class="tsd-kind-property">adapter</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type ">GPUAdapter</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L28">webgpu.ts:28</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L28">webgpu.ts:28</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="adapterInfo" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>adapter<wbr/>Info</span><a href="#adapterInfo" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">adapter<wbr/>Info</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type ">GPUAdapterInfo</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L29">webgpu.ts:29</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L29">webgpu.ts:29</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="device" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>device</span><a href="#device" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">device</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type ">GPUDevice</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L30">webgpu.ts:30</a></li></ul></aside></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L30">webgpu.ts:30</a></li></ul></aside></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/interfaces/InitProgressReport.html b/docs/reference/api/typedoc/interfaces/InitProgressReport.html
index 7ad733c5c7..1bc6b66d5a 100644
--- a/docs/reference/api/typedoc/interfaces/InitProgressReport.html
+++ b/docs/reference/api/typedoc/interfaces/InitProgressReport.html
@@ -20,7 +20,7 @@
 <ul class="tsd-hierarchy">
 <li><span class="target">InitProgressReport</span></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L979">runtime.ts:979</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L979">runtime.ts:979</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -39,22 +39,22 @@
 <h3 class="tsd-anchor-link"><span>cache<wbr/>Only</span><a href="#cacheOnly" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><path stroke="none" d="M0 0h24v24H0z" fill="none" id="icon-anchor-a"></path><path d="M10 14a3.5 3.5 0 0 0 5 0l4 -4a3.5 3.5 0 0 0 -5 -5l-.5 .5" id="icon-anchor-b"></path><path d="M14 10a3.5 3.5 0 0 0 -5 0 [...]
 <div class="tsd-signature"><span class="tsd-kind-property">cache<wbr/>Only</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">boolean</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L982">runtime.ts:982</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L982">runtime.ts:982</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="progress" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>progress</span><a href="#progress" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">progress</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L980">runtime.ts:980</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L980">runtime.ts:980</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="text" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>text</span><a href="#text" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">text</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">string</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L983">runtime.ts:983</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L983">runtime.ts:983</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="timeElapsed" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>time<wbr/>Elapsed</span><a href="#timeElapsed" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">time<wbr/>Elapsed</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L981">runtime.ts:981</a></li></ul></aside></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L981">runtime.ts:981</a></li></ul></aside></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/interfaces/LibraryProvider.html b/docs/reference/api/typedoc/interfaces/LibraryProvider.html
index 8066550ec6..5e01473def 100644
--- a/docs/reference/api/typedoc/interfaces/LibraryProvider.html
+++ b/docs/reference/api/typedoc/interfaces/LibraryProvider.html
@@ -32,7 +32,7 @@ to allow the library provider to initialize related resources during startup tim
 <ul class="tsd-hierarchy">
 <li><a href="../classes/_internal_.Environment.html" class="tsd-signature-type tsd-kind-class">Environment</a></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/types.ts#L32">types.ts:32</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/types.ts#L32">types.ts:32</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -51,7 +51,7 @@ to allow the library provider to initialize related resources during startup tim
 <div class="tsd-comment tsd-typography"><p>The imports that can be passed to WebAssembly instance creation.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/types.ts#L34">types.ts:34</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/types.ts#L34">types.ts:34</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="start" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>start</span><a href="#start" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">start</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-symbol">(</span><span class="tsd-signature-symbol">(</span><span class="tsd-kind-parameter">inst</span><span class="tsd-signature-symbol">)</span><span class="tsd-signature-symbol"> =&gt; </span><span class="tsd-signature-type">void</span><span class="tsd-signature-symbol">)</span></div>
@@ -73,7 +73,7 @@ to allow the library provider to initialize related resources during startup tim
 </div></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4></li></ul></li></ul></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/types.ts#L39">types.ts:39</a></li></ul></aside></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/types.ts#L39">types.ts:39</a></li></ul></aside></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/interfaces/_internal_.FunctionInfo.html b/docs/reference/api/typedoc/interfaces/_internal_.FunctionInfo.html
index 0216962490..bf4d7dbb09 100644
--- a/docs/reference/api/typedoc/interfaces/_internal_.FunctionInfo.html
+++ b/docs/reference/api/typedoc/interfaces/_internal_.FunctionInfo.html
@@ -24,7 +24,7 @@
 <ul class="tsd-hierarchy">
 <li><span class="target">FunctionInfo</span></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L335">webgpu.ts:335</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L335">webgpu.ts:335</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -42,17 +42,17 @@
 <h3 class="tsd-anchor-link"><span>arg_<wbr/>types</span><a href="#arg_types" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><path stroke="none" d="M0 0h24v24H0z" fill="none" id="icon-anchor-a"></path><path d="M10 14a3.5 3.5 0 0 0 5 0l4 -4a3.5 3.5 0 0 0 -5 -5l-.5 .5" id="icon-anchor-b"></path><path d="M14 10a3.5 3.5 0 0 0 -5 0 [...]
 <div class="tsd-signature"><span class="tsd-kind-property">arg_<wbr/>types</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">string</span><span class="tsd-signature-symbol">[]</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L337">webgpu.ts:337</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L337">webgpu.ts:337</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="launch_param_tags" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>launch_<wbr/>param_<wbr/>tags</span><a href="#launch_param_tags" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">launch_<wbr/>param_<wbr/>tags</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">string</span><span class="tsd-signature-symbol">[]</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L338">webgpu.ts:338</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L338">webgpu.ts:338</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="name" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>name</span><a href="#name" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">name</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">string</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/webgpu.ts#L336">webgpu.ts:336</a></li></ul></aside></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/webgpu.ts#L336">webgpu.ts:336</a></li></ul></aside></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/interfaces/_internal_.NDArrayCacheEntry.html b/docs/reference/api/typedoc/interfaces/_internal_.NDArrayCacheEntry.html
index 559c0d8990..64493f6b2f 100644
--- a/docs/reference/api/typedoc/interfaces/_internal_.NDArrayCacheEntry.html
+++ b/docs/reference/api/typedoc/interfaces/_internal_.NDArrayCacheEntry.html
@@ -21,7 +21,7 @@
 <ul class="tsd-hierarchy">
 <li><span class="target">NDArrayCacheEntry</span></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L963">runtime.ts:963</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L963">runtime.ts:963</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -42,32 +42,32 @@
 <h3 class="tsd-anchor-link"><span>byte<wbr/>Offset</span><a href="#byteOffset" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><path stroke="none" d="M0 0h24v24H0z" fill="none" id="icon-anchor-a"></path><path d="M10 14a3.5 3.5 0 0 0 5 0l4 -4a3.5 3.5 0 0 0 -5 -5l-.5 .5" id="icon-anchor-b"></path><path d="M14 10a3.5 3.5 0 0 0 -5 [...]
 <div class="tsd-signature"><span class="tsd-kind-property">byte<wbr/>Offset</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L968">runtime.ts:968</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L968">runtime.ts:968</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="dtype" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>dtype</span><a href="#dtype" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">dtype</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">string</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L966">runtime.ts:966</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L966">runtime.ts:966</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="format" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>format</span><a href="#format" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">format</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">&quot;f32-to-bf16&quot;</span><span class="tsd-signature-symbol"> | </span><span class="tsd-signature-type">&quot;raw&quot;</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L967">runtime.ts:967</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L967">runtime.ts:967</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="name" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>name</span><a href="#name" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">name</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">string</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L964">runtime.ts:964</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L964">runtime.ts:964</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="nbytes" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>nbytes</span><a href="#nbytes" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">nbytes</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L969">runtime.ts:969</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L969">runtime.ts:969</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="shape" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>shape</span><a href="#shape" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">shape</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span><span class="tsd-signature-symbol">[]</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L965">runtime.ts:965</a></li></ul></aside></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L965">runtime.ts:965</a></li></ul></aside></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/interfaces/_internal_.NDArrayShardEntry.html b/docs/reference/api/typedoc/interfaces/_internal_.NDArrayShardEntry.html
index c5a407758d..04c455f602 100644
--- a/docs/reference/api/typedoc/interfaces/_internal_.NDArrayShardEntry.html
+++ b/docs/reference/api/typedoc/interfaces/_internal_.NDArrayShardEntry.html
@@ -21,7 +21,7 @@
 <ul class="tsd-hierarchy">
 <li><span class="target">NDArrayShardEntry</span></li></ul></section><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L972">runtime.ts:972</a></li></ul></aside>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L972">runtime.ts:972</a></li></ul></aside>
 <section class="tsd-panel-group tsd-index-group">
 <section class="tsd-panel tsd-index-panel">
 <details class="tsd-index-content tsd-index-accordion" open><summary class="tsd-accordion-summary tsd-index-summary">
@@ -40,22 +40,22 @@
 <h3 class="tsd-anchor-link"><span>data<wbr/>Path</span><a href="#dataPath" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><path stroke="none" d="M0 0h24v24H0z" fill="none" id="icon-anchor-a"></path><path d="M10 14a3.5 3.5 0 0 0 5 0l4 -4a3.5 3.5 0 0 0 -5 -5l-.5 .5" id="icon-anchor-b"></path><path d="M14 10a3.5 3.5 0 0 0 -5 0l- [...]
 <div class="tsd-signature"><span class="tsd-kind-property">data<wbr/>Path</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">string</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L973">runtime.ts:973</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L973">runtime.ts:973</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="format" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>format</span><a href="#format" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">format</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">&quot;raw-shard&quot;</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L974">runtime.ts:974</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L974">runtime.ts:974</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="nbytes" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>nbytes</span><a href="#nbytes" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">nbytes</span><span class="tsd-signature-symbol">:</span> <span class="tsd-signature-type">number</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L975">runtime.ts:975</a></li></ul></aside></section>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L975">runtime.ts:975</a></li></ul></aside></section>
 <section class="tsd-panel tsd-member"><a id="records" class="tsd-anchor"></a>
 <h3 class="tsd-anchor-link"><span>records</span><a href="#records" aria-label="Permalink" class="tsd-anchor-icon"><svg class="icon icon-tabler icon-tabler-link" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round"><use href="#icon-anchor-a"></use><use href="#icon-anchor-b"></use><use href="#icon-anchor-c"></use></svg></a></h3>
 <div class="tsd-signature"><span class="tsd-kind-property">records</span><span class="tsd-signature-symbol">:</span> <a href="_internal_.NDArrayCacheEntry.html" class="tsd-signature-type tsd-kind-interface">NDArrayCacheEntry</a><span class="tsd-signature-symbol">[]</span></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L976">runtime.ts:976</a></li></ul></aside></section></section></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L976">runtime.ts:976</a></li></ul></aside></section></section></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/types/InitProgressCallback.html b/docs/reference/api/typedoc/types/InitProgressCallback.html
index c7068ee0cd..af429e02cb 100644
--- a/docs/reference/api/typedoc/types/InitProgressCallback.html
+++ b/docs/reference/api/typedoc/types/InitProgressCallback.html
@@ -30,7 +30,7 @@
 <h5><span class="tsd-kind-parameter">report</span>: <a href="../interfaces/InitProgressReport.html" class="tsd-signature-type tsd-kind-interface">InitProgressReport</a></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4></li></ul></li></ul></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L986">runtime.ts:986</a></li></ul></aside></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L986">runtime.ts:986</a></li></ul></aside></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/types/PackedFunc.html b/docs/reference/api/typedoc/types/PackedFunc.html
index 3bb40ab98e..990ca17372 100644
--- a/docs/reference/api/typedoc/types/PackedFunc.html
+++ b/docs/reference/api/typedoc/types/PackedFunc.html
@@ -19,7 +19,7 @@
 <div class="tsd-comment tsd-typography"><p>Type for PackedFunc inthe TVMRuntime.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L36">runtime.ts:36</a></li></ul></aside></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L36">runtime.ts:36</a></li></ul></aside></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/types/_internal_.FObjectConstructor.html b/docs/reference/api/typedoc/types/_internal_.FObjectConstructor.html
index 5df675efa4..1c1b39b2ec 100644
--- a/docs/reference/api/typedoc/types/_internal_.FObjectConstructor.html
+++ b/docs/reference/api/typedoc/types/_internal_.FObjectConstructor.html
@@ -37,7 +37,7 @@
 <h5><span class="tsd-kind-parameter">ctx</span>: <a href="../classes/_internal_.RuntimeContext.html" class="tsd-signature-type tsd-kind-class">RuntimeContext</a></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="../classes/TVMObject.html" class="tsd-signature-type tsd-kind-class">TVMObject</a></h4></li></ul></li></ul></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L857">runtime.ts:857</a></li></ul></aside></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L857">runtime.ts:857</a></li></ul></aside></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/types/_internal_.FTVMWasmAllocSpace.html b/docs/reference/api/typedoc/types/_internal_.FTVMWasmAllocSpace.html
index bc5a5eedf5..9ae5cbcf95 100644
--- a/docs/reference/api/typedoc/types/_internal_.FTVMWasmAllocSpace.html
+++ b/docs/reference/api/typedoc/types/_internal_.FTVMWasmAllocSpace.html
@@ -33,7 +33,7 @@
 <h5><span class="tsd-kind-parameter">size</span>: <span class="tsd-signature-type">number</span></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <a href="_internal_.Pointer.html" class="tsd-signature-type tsd-kind-type-alias">Pointer</a></h4></li></ul></li></ul></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/ctypes.ts#L194">ctypes.ts:194</a></li></ul></aside></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/ctypes.ts#L194">ctypes.ts:194</a></li></ul></aside></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/types/_internal_.FTVMWasmFreeSpace.html b/docs/reference/api/typedoc/types/_internal_.FTVMWasmFreeSpace.html
index f446c13fa0..1dc3f4eecf 100644
--- a/docs/reference/api/typedoc/types/_internal_.FTVMWasmFreeSpace.html
+++ b/docs/reference/api/typedoc/types/_internal_.FTVMWasmFreeSpace.html
@@ -33,7 +33,7 @@
 <h5><span class="tsd-kind-parameter">ptr</span>: <a href="_internal_.Pointer.html" class="tsd-signature-type tsd-kind-type-alias">Pointer</a></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">void</span></h4></li></ul></li></ul></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/ctypes.ts#L197">ctypes.ts:197</a></li></ul></aside></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/ctypes.ts#L197">ctypes.ts:197</a></li></ul></aside></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/types/_internal_.FTVMWasmPackedCFunc.html b/docs/reference/api/typedoc/types/_internal_.FTVMWasmPackedCFunc.html
index 0ce8e8dfcb..d400247408 100644
--- a/docs/reference/api/typedoc/types/_internal_.FTVMWasmPackedCFunc.html
+++ b/docs/reference/api/typedoc/types/_internal_.FTVMWasmPackedCFunc.html
@@ -45,7 +45,7 @@
 <h5><span class="tsd-kind-parameter">resourceHandle</span>: <a href="_internal_.Pointer.html" class="tsd-signature-type tsd-kind-type-alias">Pointer</a></h5></li></ul></div>
 <h4 class="tsd-returns-title">Returns <span class="tsd-signature-type">number</span></h4></li></ul></li></ul></div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/ctypes.ts#L206">ctypes.ts:206</a></li></ul></aside></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/ctypes.ts#L206">ctypes.ts:206</a></li></ul></aside></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/types/_internal_.Pointer.html b/docs/reference/api/typedoc/types/_internal_.Pointer.html
index 7640f3fad9..a9a8b314a4 100644
--- a/docs/reference/api/typedoc/types/_internal_.Pointer.html
+++ b/docs/reference/api/typedoc/types/_internal_.Pointer.html
@@ -20,7 +20,7 @@
 <div class="tsd-comment tsd-typography"><p>A pointer to points to the raw address space.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/ctypes.ts#L25">ctypes.ts:25</a></li></ul></aside></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/ctypes.ts#L25">ctypes.ts:25</a></li></ul></aside></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/reference/api/typedoc/types/_internal_.TVMObjectBase.html b/docs/reference/api/typedoc/types/_internal_.TVMObjectBase.html
index 497b4fcce2..6f36b23bc0 100644
--- a/docs/reference/api/typedoc/types/_internal_.TVMObjectBase.html
+++ b/docs/reference/api/typedoc/types/_internal_.TVMObjectBase.html
@@ -20,7 +20,7 @@
 <div class="tsd-comment tsd-typography"><p>All possible object types.</p>
 </div><aside class="tsd-sources">
 <ul>
-<li>Defined in <a href="https://github.com/apache/tvm/blob/efc2ae984/web/src/runtime.ts#L860">runtime.ts:860</a></li></ul></aside></div>
+<li>Defined in <a href="https://github.com/apache/tvm/blob/5645c52c6/web/src/runtime.ts#L860">runtime.ts:860</a></li></ul></aside></div>
 <div class="col-sidebar">
 <div class="page-menu">
 <div class="tsd-navigation settings">
diff --git a/docs/searchindex.js b/docs/searchindex.js
index 07f4b4f140..c44b1c12bb 100644
--- a/docs/searchindex.js
+++ b/docs/searchindex.js
@@ -1 +1 @@
-Search.setIndex({docnames:["arch/benchmark","arch/convert_layout","arch/debugger","arch/device_target_interactions","arch/frontend/tensorflow","arch/hybrid_script","arch/index","arch/inferbound","arch/introduction_to_module_serialization","arch/microtvm_design","arch/microtvm_project_api","arch/model_library_format","arch/pass_infra","arch/relay_intro","arch/relay_op_strategy","arch/runtime","arch/runtimes/vulkan","arch/security","arch/virtual_machine","contribute/ci","contribute/code_gu [...]
\ No newline at end of file
+Search.setIndex({docnames:["arch/benchmark","arch/convert_layout","arch/debugger","arch/device_target_interactions","arch/frontend/tensorflow","arch/hybrid_script","arch/index","arch/inferbound","arch/introduction_to_module_serialization","arch/microtvm_design","arch/microtvm_project_api","arch/model_library_format","arch/pass_infra","arch/relay_intro","arch/relay_op_strategy","arch/runtime","arch/runtimes/vulkan","arch/security","arch/virtual_machine","contribute/ci","contribute/code_gu [...]
\ No newline at end of file
diff --git a/docs/topic/vta/tutorials/frontend/deploy_detection.html b/docs/topic/vta/tutorials/frontend/deploy_detection.html
index 57825e4057..f56eb3831f 100644
--- a/docs/topic/vta/tutorials/frontend/deploy_detection.html
+++ b/docs/topic/vta/tutorials/frontend/deploy_detection.html
@@ -619,7 +619,7 @@ and dense layer which will both be executed in fp32 on the CPU.</p></li>
 </div>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>/workspace/python/tvm/relay/build_module.py:345: DeprecationWarning: Please use input parameter mod (tvm.IRModule) instead of deprecated parameter mod (tvm.relay.function.Function)
   warnings.warn(
-yolov3-tiny inference graph built in 25.97s!
+yolov3-tiny inference graph built in 24.99s!
 </pre></div>
 </div>
 </div>
diff --git a/docs/topic/vta/tutorials/frontend/sg_execution_times.html b/docs/topic/vta/tutorials/frontend/sg_execution_times.html
index 229d8ef685..2d69bc1d2a 100644
--- a/docs/topic/vta/tutorials/frontend/sg_execution_times.html
+++ b/docs/topic/vta/tutorials/frontend/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-topic-vta-tutorials-frontend-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>00:58.720</strong> total execution time for <strong>topic_vta_tutorials_frontend</strong> files:</p>
+<p><strong>00:57.499</strong> total execution time for <strong>topic_vta_tutorials_frontend</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 83%" />
@@ -369,7 +369,7 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="deploy_detection.html#sphx-glr-topic-vta-tutorials-frontend-deploy-detection-py"><span class="std std-ref">Deploy Pretrained Vision Detection Model from Darknet on VTA</span></a> (<code class="docutils literal notranslate"><span class="pre">deploy_detection.py</span></code>)</p></td>
-<td><p>00:58.720</p></td>
+<td><p>00:57.499</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 </tbody>
diff --git a/docs/topic/vta/tutorials/optimize/sg_execution_times.html b/docs/topic/vta/tutorials/optimize/sg_execution_times.html
index 8214f6eca6..a082d1b666 100644
--- a/docs/topic/vta/tutorials/optimize/sg_execution_times.html
+++ b/docs/topic/vta/tutorials/optimize/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-topic-vta-tutorials-optimize-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>00:03.215</strong> total execution time for <strong>topic_vta_tutorials_optimize</strong> files:</p>
+<p><strong>00:03.224</strong> total execution time for <strong>topic_vta_tutorials_optimize</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 84%" />
@@ -369,11 +369,11 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="convolution_opt.html#sphx-glr-topic-vta-tutorials-optimize-convolution-opt-py"><span class="std std-ref">2D Convolution Optimization</span></a> (<code class="docutils literal notranslate"><span class="pre">convolution_opt.py</span></code>)</p></td>
-<td><p>00:02.722</p></td>
+<td><p>00:02.734</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="matrix_multiply_opt.html#sphx-glr-topic-vta-tutorials-optimize-matrix-multiply-opt-py"><span class="std std-ref">Matrix Multiply Blocking</span></a> (<code class="docutils literal notranslate"><span class="pre">matrix_multiply_opt.py</span></code>)</p></td>
-<td><p>00:00.493</p></td>
+<td><p>00:00.490</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 </tbody>
diff --git a/docs/topic/vta/tutorials/sg_execution_times.html b/docs/topic/vta/tutorials/sg_execution_times.html
index adbc657961..55322713b6 100644
--- a/docs/topic/vta/tutorials/sg_execution_times.html
+++ b/docs/topic/vta/tutorials/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-topic-vta-tutorials-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>00:00.814</strong> total execution time for <strong>topic_vta_tutorials</strong> files:</p>
+<p><strong>00:00.809</strong> total execution time for <strong>topic_vta_tutorials</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 81%" />
@@ -369,11 +369,11 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="matrix_multiply.html#sphx-glr-topic-vta-tutorials-matrix-multiply-py"><span class="std std-ref">Simple Matrix Multiply</span></a> (<code class="docutils literal notranslate"><span class="pre">matrix_multiply.py</span></code>)</p></td>
-<td><p>00:00.421</p></td>
+<td><p>00:00.419</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="vta_get_started.html#sphx-glr-topic-vta-tutorials-vta-get-started-py"><span class="std std-ref">Get Started with VTA</span></a> (<code class="docutils literal notranslate"><span class="pre">vta_get_started.py</span></code>)</p></td>
-<td><p>00:00.393</p></td>
+<td><p>00:00.391</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 </tbody>
diff --git a/docs/tutorial/auto_scheduler_matmul_x86.html b/docs/tutorial/auto_scheduler_matmul_x86.html
index c3b3136beb..7004daba94 100644
--- a/docs/tutorial/auto_scheduler_matmul_x86.html
+++ b/docs/tutorial/auto_scheduler_matmul_x86.html
@@ -512,9 +512,6 @@ trials, we can load the best schedule from the log file and apply it.</p>
 <a href="../reference/api/python/te.html#tvm.te.Schedule" title="tvm.te.Schedule" class="sphx-glr-backref-module-tvm-te sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class="n">sch</span></a><span class="p">,</span> <a href="../reference/api/python/ir.html#tvm.ir.Array" title="tvm.ir.Array" class="sphx-glr-backref-module-tvm-ir sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class="n">args</span></a> <span class="o">=</span> <a href="../reference/api/pyth [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>*E
-</pre></div>
-</div>
 </div>
 <div class="section" id="inspecting-the-optimized-schedule">
 <h2>Inspecting the Optimized Schedule<a class="headerlink" href="#inspecting-the-optimized-schedule" title="Permalink to this headline">¶</a></h2>
@@ -592,7 +589,7 @@ class Module:
 <span class="p">)</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Execution time of this operator: 94.584 ms
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Execution time of this operator: 91.374 ms
 </pre></div>
 </div>
 </div>
@@ -664,7 +661,7 @@ automatically optimize a matrix multiplication, without the need to specify a
 search template.  It ends a series of examples that starts from the Tensor
 Expression (TE) language that demonstrates how TVM can optimize computational
 operations.</p>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  31.186 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  30.520 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-tutorial-auto-scheduler-matmul-x86-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../_downloads/eac4389b114db015e95cb3cdf8b86b83/auto_scheduler_matmul_x86.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">auto_scheduler_matmul_x86.py</span></code></a></p>
diff --git a/docs/tutorial/autotvm_matmul_x86.html b/docs/tutorial/autotvm_matmul_x86.html
index 6b8d66c4b1..98b64065c7 100644
--- a/docs/tutorial/autotvm_matmul_x86.html
+++ b/docs/tutorial/autotvm_matmul_x86.html
@@ -700,16 +700,16 @@ reduce variance, we take 5 measurements and average them.</p>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>waiting for device...
 device available
 Get devices for measurement successfully!
-No: 1   GFLOPS: 1.51/1.51       result: MeasureResult(costs=(0.17835317460000003,), error_no=MeasureErrorNo.NO_ERROR, all_cost=3.10009503364563, timestamp=1708112752.993949)   [(&#39;tile_y&#39;, [-1, 1]), (&#39;tile_x&#39;, [-1, 1])],None,0
-No: 2   GFLOPS: 11.34/11.34     result: MeasureResult(costs=(0.0236654322,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6512999534606934, timestamp=1708112753.635269)        [(&#39;tile_y&#39;, [-1, 128]), (&#39;tile_x&#39;, [-1, 32])],None,57
-No: 3   GFLOPS: 3.86/11.34      result: MeasureResult(costs=(0.0694950984,), error_no=MeasureErrorNo.NO_ERROR, all_cost=1.3890049457550049, timestamp=1708112755.0241795)       [(&#39;tile_y&#39;, [-1, 4]), (&#39;tile_x&#39;, [-1, 2])],None,12
-No: 4   GFLOPS: 12.59/12.59     result: MeasureResult(costs=(0.021325845599999997,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6152889728546143, timestamp=1708112755.6300626)       [(&#39;tile_y&#39;, [-1, 32]), (&#39;tile_x&#39;, [-1, 128])],None,75
-No: 5   GFLOPS: 10.53/12.59     result: MeasureResult(costs=(0.0255028956,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.724308967590332, timestamp=1708112756.465089) [(&#39;tile_y&#39;, [-1, 1]), (&#39;tile_x&#39;, [-1, 64])],None,60
-No: 6   GFLOPS: 3.65/12.59      result: MeasureResult(costs=(0.0735273582,), error_no=MeasureErrorNo.NO_ERROR, all_cost=1.4519906044006348, timestamp=1708112757.901221)        [(&#39;tile_y&#39;, [-1, 64]), (&#39;tile_x&#39;, [-1, 8])],None,36
-No: 7   GFLOPS: 7.74/12.59      result: MeasureResult(costs=(0.0346974656,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.8124942779541016, timestamp=1708112758.7230752)       [(&#39;tile_y&#39;, [-1, 512]), (&#39;tile_x&#39;, [-1, 32])],None,59
-No: 8   GFLOPS: 5.78/12.59      result: MeasureResult(costs=(0.046433825,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.9909617900848389, timestamp=1708112759.725885) [(&#39;tile_y&#39;, [-1, 1]), (&#39;tile_x&#39;, [-1, 4])],None,20
-No: 9   GFLOPS: 11.01/12.59     result: MeasureResult(costs=(0.0243856706,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6288285255432129, timestamp=1708112760.4636207)       [(&#39;tile_y&#39;, [-1, 64]), (&#39;tile_x&#39;, [-1, 256])],None,86
-No: 10  GFLOPS: 2.02/12.59      result: MeasureResult(costs=(0.133171552,), error_no=MeasureErrorNo.NO_ERROR, all_cost=2.3537261486053467, timestamp=1708112762.8518498)        [(&#39;tile_y&#39;, [-1, 64]), (&#39;tile_x&#39;, [-1, 4])],None,26
+No: 1   GFLOPS: 9.13/9.13       result: MeasureResult(costs=(0.0293897064,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.71429443359375, timestamp=1708122324.5400858) [(&#39;tile_y&#39;, [-1, 1]), (&#39;tile_x&#39;, [-1, 32])],None,50
+No: 2   GFLOPS: 11.23/11.23     result: MeasureResult(costs=(0.0239100742,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.7238938808441162, timestamp=1708122325.1927767)       [(&#39;tile_y&#39;, [-1, 1]), (&#39;tile_x&#39;, [-1, 128])],None,70
+No: 3   GFLOPS: 13.73/13.73     result: MeasureResult(costs=(0.0195579744,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6750574111938477, timestamp=1708122325.7714353)       [(&#39;tile_y&#39;, [-1, 256]), (&#39;tile_x&#39;, [-1, 128])],None,78
+No: 4   GFLOPS: 13.77/13.77     result: MeasureResult(costs=(0.0195008208,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.5940024852752686, timestamp=1708122326.3449867)       [(&#39;tile_y&#39;, [-1, 8]), (&#39;tile_x&#39;, [-1, 256])],None,83
+No: 5   GFLOPS: 10.60/13.77     result: MeasureResult(costs=(0.025329714000000003,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6626152992248535, timestamp=1708122327.1872604)       [(&#39;tile_y&#39;, [-1, 4]), (&#39;tile_x&#39;, [-1, 32])],None,52
+No: 6   GFLOPS: 12.22/13.77     result: MeasureResult(costs=(0.0219690712,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6191966533660889, timestamp=1708122327.7918754)       [(&#39;tile_y&#39;, [-1, 16]), (&#39;tile_x&#39;, [-1, 16])],None,44
+No: 7   GFLOPS: 9.28/13.77      result: MeasureResult(costs=(0.0289150848,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.724522590637207, timestamp=1708122328.505882) [(&#39;tile_y&#39;, [-1, 256]), (&#39;tile_x&#39;, [-1, 16])],None,48
+No: 8   GFLOPS: 9.59/13.77      result: MeasureResult(costs=(0.028001969200000004,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.7735788822174072, timestamp=1708122329.2077875)       [(&#39;tile_y&#39;, [-1, 512]), (&#39;tile_x&#39;, [-1, 64])],None,69
+No: 9   GFLOPS: 13.80/13.80     result: MeasureResult(costs=(0.0194582234,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.5858185291290283, timestamp=1708122329.90334) [(&#39;tile_y&#39;, [-1, 128]), (&#39;tile_x&#39;, [-1, 128])],None,77
+No: 10  GFLOPS: 9.96/13.80      result: MeasureResult(costs=(0.026938249000000004,), error_no=MeasureErrorNo.NO_ERROR, all_cost=0.6510946750640869, timestamp=1708122330.585345)        [(&#39;tile_y&#39;, [-1, 4]), (&#39;tile_x&#39;, [-1, 16])],None,42
 </pre></div>
 </div>
 <p>With tuning completed, we can choose the configuration from the log file that
diff --git a/docs/tutorial/autotvm_relay_x86.html b/docs/tutorial/autotvm_relay_x86.html
index dead644662..4e37663e31 100644
--- a/docs/tutorial/autotvm_relay_x86.html
+++ b/docs/tutorial/autotvm_relay_x86.html
@@ -578,7 +578,7 @@ standard deviation.</p>
 <span class="nb">print</span><span class="p">(</span><a href="https://docs.python.org/3/library/stdtypes.html#dict" title="builtins.dict" class="sphx-glr-backref-module-builtins sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class="n">unoptimized</span></a><span class="p">)</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>{&#39;mean&#39;: 492.71140236000065, &#39;median&#39;: 492.71483809998244, &#39;std&#39;: 3.6515681597064877}
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>{&#39;mean&#39;: 466.5144750900072, &#39;median&#39;: 466.5101404500092, &#39;std&#39;: 1.2822160389899164}
 </pre></div>
 </div>
 </div>
@@ -767,178 +767,179 @@ depending on the specifics of the model and the target platform.</p>
 </pre></div>
 </div>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>[Task  1/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task  1/25]  Current/Best:   10.92/  23.17 GFLOPS | Progress: (4/20) | 9.02 s
-[Task  1/25]  Current/Best:    9.70/  23.17 GFLOPS | Progress: (8/20) | 11.45 s
-[Task  1/25]  Current/Best:   12.83/  23.17 GFLOPS | Progress: (12/20) | 14.44 s
-[Task  1/25]  Current/Best:    3.33/  23.58 GFLOPS | Progress: (16/20) | 17.31 s
-[Task  1/25]  Current/Best:   10.67/  23.58 GFLOPS | Progress: (20/20) | 22.19 s Done.
+[Task  1/25]  Current/Best:   13.35/  16.35 GFLOPS | Progress: (4/20) | 9.05 s
+[Task  1/25]  Current/Best:   20.95/  20.95 GFLOPS | Progress: (8/20) | 11.39 s
+[Task  1/25]  Current/Best:   23.80/  23.80 GFLOPS | Progress: (12/20) | 13.37 s
+[Task  1/25]  Current/Best:   18.53/  24.41 GFLOPS | Progress: (16/20) | 16.14 s
+[Task  1/25]  Current/Best:   15.94/  24.41 GFLOPS | Progress: (20/20) | 18.52 s Done.
 
 [Task  2/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task  2/25]  Current/Best:   15.66/  17.36 GFLOPS | Progress: (4/20) | 4.21 s
-[Task  2/25]  Current/Best:    7.19/  17.36 GFLOPS | Progress: (8/20) | 5.92 s
-[Task  2/25]  Current/Best:    3.39/  17.36 GFLOPS | Progress: (12/20) | 7.81 s
-[Task  2/25]  Current/Best:    7.31/  17.36 GFLOPS | Progress: (16/20) | 9.32 s
-[Task  2/25]  Current/Best:   17.71/  17.71 GFLOPS | Progress: (20/20) | 10.90 s Done.
+[Task  2/25]  Current/Best:   18.96/  18.96 GFLOPS | Progress: (4/20) | 3.85 s
+[Task  2/25]  Current/Best:    9.18/  22.58 GFLOPS | Progress: (8/20) | 5.38 s
+[Task  2/25]  Current/Best:    6.46/  22.58 GFLOPS | Progress: (12/20) | 7.03 s
+[Task  2/25]  Current/Best:    3.51/  22.58 GFLOPS | Progress: (16/20) | 8.59 s
+[Task  2/25]  Current/Best:    7.97/  22.58 GFLOPS | Progress: (20/20) | 10.27 s Done.
 
 [Task  3/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task  3/25]  Current/Best:    1.62/  19.44 GFLOPS | Progress: (4/20) | 7.61 s
-[Task  3/25]  Current/Best:   19.37/  24.26 GFLOPS | Progress: (8/20) | 10.96 s
-[Task  3/25]  Current/Best:   17.35/  24.26 GFLOPS | Progress: (12/20) | 13.65 s
-[Task  3/25]  Current/Best:    6.41/  24.26 GFLOPS | Progress: (16/20) | 16.03 s
-[Task  3/25]  Current/Best:   11.35/  24.26 GFLOPS | Progress: (20/20) | 18.99 s Done.
+[Task  3/25]  Current/Best:    6.44/  20.95 GFLOPS | Progress: (4/20) | 4.64 s
+[Task  3/25]  Current/Best:    6.42/  20.95 GFLOPS | Progress: (8/20) | 7.39 s
+[Task  3/25]  Current/Best:    8.39/  20.95 GFLOPS | Progress: (12/20) | 9.95 s
+[Task  3/25]  Current/Best:    6.65/  20.95 GFLOPS | Progress: (16/20) | 12.31 s
+[Task  3/25]  Current/Best:    8.44/  20.95 GFLOPS | Progress: (20/20) | 14.70 s Done.
 
 [Task  4/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task  4/25]  Current/Best:   15.56/  19.73 GFLOPS | Progress: (4/20) | 4.56 s
-[Task  4/25]  Current/Best:   14.50/  19.73 GFLOPS | Progress: (8/20) | 7.26 s
-[Task  4/25]  Current/Best:   10.23/  19.73 GFLOPS | Progress: (12/20) | 9.42 s
-[Task  4/25]  Current/Best:   14.39/  19.73 GFLOPS | Progress: (16/20) | 13.60 s
-[Task  4/25]  Current/Best:   13.08/  19.73 GFLOPS | Progress: (20/20) | 17.03 s Done.
+[Task  4/25]  Current/Best:   11.00/  20.17 GFLOPS | Progress: (4/20) | 5.23 s
+[Task  4/25]  Current/Best:    7.93/  20.17 GFLOPS | Progress: (8/20) | 8.36 s
+[Task  4/25]  Current/Best:    9.49/  20.83 GFLOPS | Progress: (12/20) | 13.38 s
+[Task  4/25]  Current/Best:    8.70/  20.83 GFLOPS | Progress: (16/20) | 15.21 s
+[Task  4/25]  Current/Best:   13.11/  21.66 GFLOPS | Progress: (20/20) | 22.72 s Done.
 
 [Task  5/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task  5/25]  Current/Best:    8.00/  14.23 GFLOPS | Progress: (4/20) | 4.58 s
-[Task  5/25]  Current/Best:    1.50/  15.63 GFLOPS | Progress: (8/20) | 7.15 s
-[Task  5/25]  Current/Best:   17.90/  17.90 GFLOPS | Progress: (12/20) | 9.19 s
-[Task  5/25]  Current/Best:    3.20/  21.82 GFLOPS | Progress: (16/20) | 12.40 s
-[Task  5/25]  Current/Best:   13.78/  21.82 GFLOPS | Progress: (20/20) | 14.91 s Done.
+[Task  5/25]  Current/Best:    4.49/  18.59 GFLOPS | Progress: (4/20) | 4.66 s
+[Task  5/25]  Current/Best:   11.28/  18.59 GFLOPS | Progress: (8/20) | 6.77 s
+[Task  5/25]  Current/Best:   13.84/  19.55 GFLOPS | Progress: (12/20) | 8.40 s
+[Task  5/25]  Current/Best:   12.94/  19.55 GFLOPS | Progress: (16/20) | 11.22 s
+[Task  5/25]  Current/Best:    6.48/  19.96 GFLOPS | Progress: (20/20) | 13.43 s Done.
 
 [Task  6/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task  6/25]  Current/Best:   11.34/  13.74 GFLOPS | Progress: (4/20) | 5.42 s
-[Task  6/25]  Current/Best:   22.03/  22.03 GFLOPS | Progress: (8/20) | 7.62 s
-[Task  6/25]  Current/Best:   15.98/  22.03 GFLOPS | Progress: (12/20) | 9.84 s
-[Task  6/25]  Current/Best:    2.42/  22.03 GFLOPS | Progress: (16/20) | 12.74 s
-[Task  6/25]  Current/Best:   14.97/  22.03 GFLOPS | Progress: (20/20) | 16.15 s Done.
+[Task  6/25]  Current/Best:   11.12/  15.81 GFLOPS | Progress: (4/20) | 7.41 s
+[Task  6/25]  Current/Best:   16.56/  21.18 GFLOPS | Progress: (8/20) | 10.53 s
+[Task  6/25]  Current/Best:   12.31/  22.07 GFLOPS | Progress: (12/20) | 14.17 s
+[Task  6/25]  Current/Best:   17.64/  22.07 GFLOPS | Progress: (16/20) | 16.15 s
+[Task  6/25]  Current/Best:   23.17/  23.17 GFLOPS | Progress: (20/20) | 18.67 s Done.
 
 [Task  7/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task  7/25]  Current/Best:    3.14/  21.01 GFLOPS | Progress: (4/20) | 5.61 s
-[Task  7/25]  Current/Best:   14.91/  21.01 GFLOPS | Progress: (8/20) | 9.35 s
-[Task  7/25]  Current/Best:   12.45/  21.01 GFLOPS | Progress: (12/20) | 12.35 s
-[Task  7/25]  Current/Best:   20.32/  21.01 GFLOPS | Progress: (16/20) | 15.01 s
-[Task  7/25]  Current/Best:   11.14/  21.01 GFLOPS | Progress: (20/20) | 17.31 s Done.
+[Task  7/25]  Current/Best:   13.84/  16.97 GFLOPS | Progress: (4/20) | 4.61 s
+[Task  7/25]  Current/Best:   12.39/  19.19 GFLOPS | Progress: (8/20) | 8.16 s
+[Task  7/25]  Current/Best:   13.44/  21.82 GFLOPS | Progress: (12/20) | 10.89 s
+[Task  7/25]  Current/Best:   18.91/  21.82 GFLOPS | Progress: (16/20) | 13.07 s
+[Task  7/25]  Current/Best:   10.72/  21.82 GFLOPS | Progress: (20/20) | 17.41 s Done.
 
 [Task  8/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task  8/25]  Current/Best:   14.71/  15.26 GFLOPS | Progress: (4/20) | 4.99 s
-[Task  8/25]  Current/Best:    2.89/  15.26 GFLOPS | Progress: (8/20) | 8.13 s
-[Task  8/25]  Current/Best:   12.13/  17.63 GFLOPS | Progress: (12/20) | 11.48 s
-[Task  8/25]  Current/Best:    9.61/  17.63 GFLOPS | Progress: (16/20) | 17.56 s
-[Task  8/25]  Current/Best:   10.39/  17.63 GFLOPS | Progress: (20/20) | 22.98 s Done.
+[Task  8/25]  Current/Best:    3.63/  18.60 GFLOPS | Progress: (4/20) | 5.00 s
+[Task  8/25]  Current/Best:   11.13/  18.60 GFLOPS | Progress: (8/20) | 8.61 s
+[Task  8/25]  Current/Best:   12.98/  18.60 GFLOPS | Progress: (12/20) | 11.21 s
+[Task  8/25]  Current/Best:   15.96/  18.60 GFLOPS | Progress: (16/20) | 13.40 s
+[Task  8/25]  Current/Best:    3.05/  18.60 GFLOPS | Progress: (20/20) | 16.55 s Done.
 
 [Task  9/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task  9/25]  Current/Best:   11.39/  16.74 GFLOPS | Progress: (4/20) | 4.93 s
-[Task  9/25]  Current/Best:    9.20/  16.74 GFLOPS | Progress: (8/20) | 8.18 s
-[Task  9/25]  Current/Best:   12.51/  16.74 GFLOPS | Progress: (12/20) | 15.73 s
-[Task  9/25]  Current/Best:   18.13/  19.73 GFLOPS | Progress: (16/20) | 18.36 s
-[Task  9/25]  Current/Best:    8.51/  19.73 GFLOPS | Progress: (20/20) | 26.27 s Done.
+[Task  9/25]  Current/Best:   17.15/  18.14 GFLOPS | Progress: (4/20) | 10.28 s
+[Task  9/25]  Current/Best:   15.92/  18.45 GFLOPS | Progress: (8/20) | 12.38 s
+[Task  9/25]  Current/Best:    9.44/  18.45 GFLOPS | Progress: (12/20) | 17.74 s
+[Task  9/25]  Current/Best:   17.44/  18.45 GFLOPS | Progress: (16/20) | 20.16 s
+[Task  9/25]  Current/Best:   22.32/  22.32 GFLOPS | Progress: (20/20) | 22.49 s Done.
 
 [Task 10/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 10/25]  Current/Best:   13.88/  14.79 GFLOPS | Progress: (4/20) | 4.77 s
-[Task 10/25]  Current/Best:   10.11/  14.79 GFLOPS | Progress: (8/20) | 7.36 s
-[Task 10/25]  Current/Best:    3.97/  17.30 GFLOPS | Progress: (12/20) | 9.42 s
-[Task 10/25]  Current/Best:   18.43/  18.43 GFLOPS | Progress: (16/20) | 11.40 s
-[Task 10/25]  Current/Best:   18.30/  18.43 GFLOPS | Progress: (20/20) | 14.24 s Done.
+[Task 10/25]  Current/Best:    9.02/  12.22 GFLOPS | Progress: (4/20) | 4.51 s
+[Task 10/25]  Current/Best:   18.76/  18.76 GFLOPS | Progress: (8/20) | 6.25 s
+[Task 10/25]  Current/Best:    6.26/  20.16 GFLOPS | Progress: (12/20) | 7.87 s
+[Task 10/25]  Current/Best:    3.17/  20.16 GFLOPS | Progress: (16/20) | 10.09 s
+[Task 10/25]  Current/Best:   10.37/  21.58 GFLOPS | Progress: (20/20) | 12.15 s Done.
 
 [Task 11/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 11/25]  Current/Best:   18.33/  18.33 GFLOPS | Progress: (4/20) | 5.54 s
-[Task 11/25]  Current/Best:   20.07/  20.07 GFLOPS | Progress: (8/20) | 7.98 s
-[Task 11/25]  Current/Best:   21.14/  21.14 GFLOPS | Progress: (12/20) | 10.34 s
-[Task 11/25]  Current/Best:   19.04/  21.14 GFLOPS | Progress: (16/20) | 12.96 s
-[Task 11/25]  Current/Best:    6.62/  21.14 GFLOPS | Progress: (20/20) | 15.31 s Done.
+[Task 11/25]  Current/Best:    9.78/  18.87 GFLOPS | Progress: (4/20) | 4.66 s
+[Task 11/25]  Current/Best:   21.41/  21.41 GFLOPS | Progress: (8/20) | 6.89 s
+[Task 11/25]  Current/Best:   21.94/  21.94 GFLOPS | Progress: (12/20) | 9.00 s
+[Task 11/25]  Current/Best:   14.37/  21.94 GFLOPS | Progress: (16/20) | 11.39 s
+[Task 11/25]  Current/Best:    6.46/  21.94 GFLOPS | Progress: (20/20) | 14.40 s Done.
 
 [Task 12/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 12/25]  Current/Best:   15.74/  16.61 GFLOPS | Progress: (4/20) | 4.99 s
-[Task 12/25]  Current/Best:    8.15/  16.61 GFLOPS | Progress: (8/20) | 9.85 s
-[Task 12/25]  Current/Best:   12.66/  16.61 GFLOPS | Progress: (12/20) | 12.90 s
-[Task 12/25]  Current/Best:   18.72/  21.36 GFLOPS | Progress: (16/20) | 16.98 s
-[Task 12/25]  Current/Best:   18.70/  21.36 GFLOPS | Progress: (20/20) | 18.93 s Done.
+[Task 12/25]  Current/Best:   14.53/  15.63 GFLOPS | Progress: (4/20) | 4.86 s
+[Task 12/25]  Current/Best:   14.80/  16.46 GFLOPS | Progress: (8/20) | 7.05 s
+[Task 12/25]  Current/Best:   12.93/  16.46 GFLOPS | Progress: (12/20) | 9.48 s
+[Task 12/25]  Current/Best:   14.84/  21.84 GFLOPS | Progress: (16/20) | 12.08 s
+[Task 12/25]  Current/Best:   11.69/  21.84 GFLOPS | Progress: (20/20) | 14.91 s Done.
 
 [Task 13/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 13/25]  Current/Best:   12.08/  21.32 GFLOPS | Progress: (4/20) | 4.91 s
-[Task 13/25]  Current/Best:   11.79/  21.32 GFLOPS | Progress: (8/20) | 8.05 s
-[Task 13/25]  Current/Best:    7.07/  21.32 GFLOPS | Progress: (12/20) | 10.37 s
-[Task 13/25]  Current/Best:   19.25/  21.32 GFLOPS | Progress: (16/20) | 13.13 s
-[Task 13/25]  Current/Best:   18.85/  21.32 GFLOPS | Progress: (20/20) | 18.32 s Done.
+[Task 13/25]  Current/Best:   13.06/  15.95 GFLOPS | Progress: (4/20) | 6.04 s
+[Task 13/25]  Current/Best:   20.37/  21.57 GFLOPS | Progress: (8/20) | 8.27 s
+[Task 13/25]  Current/Best:   17.56/  21.57 GFLOPS | Progress: (12/20) | 12.57 s
+[Task 13/25]  Current/Best:    9.53/  22.89 GFLOPS | Progress: (16/20) | 16.01 s
+[Task 13/25]  Current/Best:    6.17/  22.89 GFLOPS | Progress: (20/20) | 19.18 s Done.
 
 [Task 14/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 14/25]  Current/Best:   12.44/  12.44 GFLOPS | Progress: (4/20) | 12.70 s
-[Task 14/25]  Current/Best:    1.61/  16.43 GFLOPS | Progress: (8/20) | 20.28 s
-[Task 14/25]  Current/Best:   16.91/  18.81 GFLOPS | Progress: (12/20) | 22.11 s
-[Task 14/25]  Current/Best:    9.50/  18.81 GFLOPS | Progress: (16/20) | 25.52 s
-[Task 14/25]  Current/Best:   19.60/  19.60 GFLOPS | Progress: (20/20) | 33.13 s Done.
+[Task 14/25]  Current/Best:   13.30/  19.36 GFLOPS | Progress: (4/20) | 6.52 s
+[Task 14/25]  Current/Best:    2.43/  20.73 GFLOPS | Progress: (8/20) | 13.17 s
+[Task 14/25]  Current/Best:    3.11/  20.73 GFLOPS | Progress: (12/20) | 16.75 s
+[Task 14/25]  Current/Best:   15.91/  20.73 GFLOPS | Progress: (16/20) | 23.37 s
+[Task 14/25]  Current/Best:    7.95/  21.68 GFLOPS | Progress: (20/20) | 28.15 s Done.
 
 [Task 15/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 15/25]  Current/Best:    7.86/  14.38 GFLOPS | Progress: (4/20) | 7.21 s
-[Task 15/25]  Current/Best:   16.73/  21.04 GFLOPS | Progress: (8/20) | 9.70 s
-[Task 15/25]  Current/Best:   13.61/  21.04 GFLOPS | Progress: (12/20) | 11.57 s
-[Task 15/25]  Current/Best:   22.30/  22.30 GFLOPS | Progress: (16/20) | 13.06 s
-[Task 15/25]  Current/Best:   22.70/  23.29 GFLOPS | Progress: (20/20) | 14.80 s Done.
-
+[Task 15/25]  Current/Best:    7.34/  23.18 GFLOPS | Progress: (4/20) | 7.79 s
+[Task 15/25]  Current/Best:   21.76/  23.18 GFLOPS | Progress: (8/20) | 12.19 s
+[Task 15/25]  Current/Best:   15.81/  23.18 GFLOPS | Progress: (12/20) | 23.20 s
+[Task 15/25]  Current/Best:    6.34/  23.18 GFLOPS | Progress: (16/20) | 31.40 s
+[Task 15/25]  Current/Best:    6.25/  23.18 GFLOPS | Progress: (20/20) | 33.30 s
 [Task 16/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 16/25]  Current/Best:   15.03/  18.73 GFLOPS | Progress: (4/20) | 5.32 s
-[Task 16/25]  Current/Best:   13.19/  18.73 GFLOPS | Progress: (8/20) | 7.39 s
-[Task 16/25]  Current/Best:    9.65/  21.93 GFLOPS | Progress: (12/20) | 9.31 s
-[Task 16/25]  Current/Best:   15.00/  21.93 GFLOPS | Progress: (16/20) | 11.10 s
-[Task 16/25]  Current/Best:   13.77/  21.93 GFLOPS | Progress: (20/20) | 14.31 s Done.
+[Task 16/25]  Current/Best:   11.21/  19.50 GFLOPS | Progress: (4/20) | 5.14 s
+[Task 16/25]  Current/Best:    6.75/  19.50 GFLOPS | Progress: (8/20) | 7.81 s
+[Task 16/25]  Current/Best:   19.29/  19.50 GFLOPS | Progress: (12/20) | 9.79 s
+[Task 16/25]  Current/Best:   20.74/  20.74 GFLOPS | Progress: (16/20) | 11.46 s
+[Task 16/25]  Current/Best:    7.78/  21.01 GFLOPS | Progress: (20/20) | 14.52 s Done.
 
 [Task 17/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 17/25]  Current/Best:   11.97/  21.77 GFLOPS | Progress: (4/20) | 5.94 s
-[Task 17/25]  Current/Best:   21.08/  21.77 GFLOPS | Progress: (8/20) | 8.59 s
-[Task 17/25]  Current/Best:   19.87/  21.77 GFLOPS | Progress: (12/20) | 11.84 s
-[Task 17/25]  Current/Best:   19.02/  21.77 GFLOPS | Progress: (16/20) | 14.05 s
-[Task 17/25]  Current/Best:   17.72/  21.77 GFLOPS | Progress: (20/20) | 18.12 s Done.
+[Task 17/25]  Current/Best:   12.75/  22.53 GFLOPS | Progress: (4/20) | 5.44 s
+[Task 17/25]  Current/Best:   20.86/  22.95 GFLOPS | Progress: (8/20) | 9.06 s
+[Task 17/25]  Current/Best:   12.20/  23.64 GFLOPS | Progress: (12/20) | 11.90 s
+[Task 17/25]  Current/Best:   14.40/  23.64 GFLOPS | Progress: (16/20) | 14.27 s
+[Task 17/25]  Current/Best:   23.02/  23.71 GFLOPS | Progress: (20/20) | 18.23 s Done.
 
 [Task 18/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 18/25]  Current/Best:    7.14/   9.34 GFLOPS | Progress: (4/20) | 6.83 s
-[Task 18/25]  Current/Best:    7.62/  17.35 GFLOPS | Progress: (8/20) | 14.49 s
-[Task 18/25]  Current/Best:   10.98/  20.59 GFLOPS | Progress: (12/20) | 16.68 s
-[Task 18/25]  Current/Best:    9.43/  20.59 GFLOPS | Progress: (16/20) | 21.59 s
-[Task 18/25]  Current/Best:   14.67/  20.59 GFLOPS | Progress: (20/20) | 23.63 s Done.
+[Task 18/25]  Current/Best:    3.18/  15.08 GFLOPS | Progress: (4/20) | 6.27 s
+[Task 18/25]  Current/Best:   19.23/  19.23 GFLOPS | Progress: (8/20) | 8.76 s
+[Task 18/25]  Current/Best:   13.78/  19.23 GFLOPS | Progress: (12/20) | 11.19 s
+[Task 18/25]  Current/Best:    3.18/  19.23 GFLOPS | Progress: (16/20) | 19.50 s
+[Task 18/25]  Current/Best:   16.95/  19.23 GFLOPS | Progress: (20/20) | 21.35 s Done.
 
 [Task 19/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 19/25]  Current/Best:    5.34/  18.71 GFLOPS | Progress: (4/20) | 6.84 s
-[Task 19/25]  Current/Best:    8.29/  20.47 GFLOPS | Progress: (8/20) | 10.01 s
-[Task 19/25]  Current/Best:   10.68/  20.47 GFLOPS | Progress: (12/20) | 13.18 s
-[Task 19/25]  Current/Best:   12.87/  21.12 GFLOPS | Progress: (16/20) | 15.89 s
-[Task 19/25]  Current/Best:    6.76/  21.12 GFLOPS | Progress: (20/20) | 20.55 s Done.
+[Task 19/25]  Current/Best:    2.79/  19.12 GFLOPS | Progress: (4/20) | 7.20 s
+[Task 19/25]  Current/Best:   20.76/  22.59 GFLOPS | Progress: (8/20) | 9.66 s
+[Task 19/25]  Current/Best:    9.34/  22.59 GFLOPS | Progress: (12/20) | 12.96 s
+[Task 19/25]  Current/Best:   12.16/  22.88 GFLOPS | Progress: (16/20) | 17.34 s
+[Task 19/25]  Current/Best:    9.35/  22.88 GFLOPS | Progress: (20/20) | 23.84 s Done.
 
 [Task 20/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 20/25]  Current/Best:   11.68/  18.78 GFLOPS | Progress: (4/20) | 9.29 s
-[Task 20/25]  Current/Best:    5.27/  18.78 GFLOPS | Progress: (8/20) | 20.97 s
-[Task 20/25]  Current/Best:   11.12/  18.78 GFLOPS | Progress: (12/20) | 33.89 s
-[Task 20/25]  Current/Best:   14.92/  18.78 GFLOPS | Progress: (16/20) | 45.26 s
-[Task 20/25]  Current/Best:    7.93/  18.78 GFLOPS | Progress: (20/20) | 58.12 s
-[Task 21/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s Done.
-
-[Task 21/25]  Current/Best:   10.60/  10.60 GFLOPS | Progress: (4/20) | 15.16 s
-[Task 21/25]  Current/Best:   16.31/  18.06 GFLOPS | Progress: (8/20) | 26.22 s
-[Task 21/25]  Current/Best:    4.82/  18.06 GFLOPS | Progress: (12/20) | 31.63 s
-[Task 21/25]  Current/Best:   19.27/  19.27 GFLOPS | Progress: (16/20) | 34.43 s
-[Task 21/25]  Current/Best:    9.97/  19.27 GFLOPS | Progress: (20/20) | 45.72 s
+[Task 20/25]  Current/Best:   12.61/  17.02 GFLOPS | Progress: (4/20) | 6.61 s
+[Task 20/25]  Current/Best:    7.21/  18.64 GFLOPS | Progress: (8/20) | 12.47 s
+[Task 20/25]  Current/Best:    9.08/  19.81 GFLOPS | Progress: (12/20) | 16.84 s
+[Task 20/25]  Current/Best:   10.01/  19.81 GFLOPS | Progress: (16/20) | 22.09 s
+[Task 20/25]  Current/Best:   10.39/  19.81 GFLOPS | Progress: (20/20) | 29.01 s Done.
+
+[Task 21/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
+[Task 21/25]  Current/Best:   11.01/  23.32 GFLOPS | Progress: (4/20) | 13.53 s
+[Task 21/25]  Current/Best:    5.52/  23.32 GFLOPS | Progress: (8/20) | 16.43 s
+[Task 21/25]  Current/Best:   16.53/  23.32 GFLOPS | Progress: (12/20) | 27.18 s Done.
+
+[Task 21/25]  Current/Best:    9.79/  23.32 GFLOPS | Progress: (16/20) | 30.73 s
+[Task 21/25]  Current/Best:    7.26/  23.32 GFLOPS | Progress: (20/20) | 42.09 s
 [Task 22/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 22/25]  Current/Best:   12.58/  20.63 GFLOPS | Progress: (4/20) | 5.54 s
-[Task 22/25]  Current/Best:   21.03/  21.90 GFLOPS | Progress: (8/20) | 7.12 s
-[Task 22/25]  Current/Best:    5.01/  21.90 GFLOPS | Progress: (12/20) | 13.98 s
-[Task 22/25]  Current/Best:   13.10/  21.90 GFLOPS | Progress: (16/20) | 15.85 s
-[Task 22/25]  Current/Best:   11.89/  21.90 GFLOPS | Progress: (20/20) | 17.78 s Done.
+[Task 22/25]  Current/Best:   15.84/  19.32 GFLOPS | Progress: (4/20) | 5.35 s
+[Task 22/25]  Current/Best:   11.24/  20.77 GFLOPS | Progress: (8/20) | 7.35 s
+[Task 22/25]  Current/Best:   19.66/  20.77 GFLOPS | Progress: (12/20) | 9.30 s
+[Task 22/25]  Current/Best:    8.19/  20.77 GFLOPS | Progress: (16/20) | 11.39 s
+[Task 22/25]  Current/Best:    8.46/  20.77 GFLOPS | Progress: (20/20) | 15.79 s Done.
 
 [Task 23/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 23/25]  Current/Best:    1.55/  20.74 GFLOPS | Progress: (4/20) | 7.46 s
-[Task 23/25]  Current/Best:   22.73/  22.73 GFLOPS | Progress: (8/20) | 10.46 s
-[Task 23/25]  Current/Best:   12.07/  22.73 GFLOPS | Progress: (12/20) | 13.11 s
-[Task 23/25]  Current/Best:   16.75/  22.73 GFLOPS | Progress: (16/20) | 16.55 s
-[Task 23/25]  Current/Best:   12.34/  22.73 GFLOPS | Progress: (20/20) | 20.19 s Done.
+[Task 23/25]  Current/Best:   24.05/  24.05 GFLOPS | Progress: (4/20) | 8.07 s
+[Task 23/25]  Current/Best:   10.32/  24.05 GFLOPS | Progress: (8/20) | 12.19 s
+[Task 23/25]  Current/Best:   24.79/  24.79 GFLOPS | Progress: (12/20) | 15.07 s
+[Task 23/25]  Current/Best:   15.39/  24.79 GFLOPS | Progress: (16/20) | 19.25 s
+[Task 23/25]  Current/Best:   19.78/  24.79 GFLOPS | Progress: (20/20) | 22.61 s Done.
 
 [Task 24/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 24/25]  Current/Best:    8.85/   8.85 GFLOPS | Progress: (4/20) | 7.53 s
-[Task 24/25]  Current/Best:    2.10/   8.85 GFLOPS | Progress: (8/20) | 18.56 s
-[Task 24/25]  Current/Best:    0.95/   8.85 GFLOPS | Progress: (12/20) | 29.62 s
-[Task 24/25]  Current/Best:    9.59/   9.59 GFLOPS | Progress: (16/20) | 42.01 s Done.
-
-[Task 24/25]  Current/Best:    3.98/   9.59 GFLOPS | Progress: (20/20) | 53.01 s
+[Task 24/25]  Current/Best:    2.59/   8.43 GFLOPS | Progress: (4/20) | 6.51 s
+[Task 24/25]  Current/Best:    5.50/   8.43 GFLOPS | Progress: (8/20) | 16.91 s
+[Task 24/25]  Current/Best:    3.18/  10.59 GFLOPS | Progress: (12/20) | 29.50 s
+[Task 24/25]  Current/Best:    3.40/  10.59 GFLOPS | Progress: (16/20) | 40.52 s
+[Task 24/25]  Current/Best:    3.78/  10.59 GFLOPS | Progress: (20/20) | 51.58 s
 [Task 25/25]  Current/Best:    0.00/   0.00 GFLOPS | Progress: (0/20) | 0.00 s
-[Task 25/25]  Current/Best:    7.72/   7.72 GFLOPS | Progress: (4/20) | 7.65 s
-[Task 25/25]  Current/Best:    1.54/   9.24 GFLOPS | Progress: (8/20) | 10.60 s
-[Task 25/25]  Current/Best:    2.99/   9.24 GFLOPS | Progress: (12/20) | 15.61 s
-[Task 25/25]  Current/Best:    5.55/   9.24 GFLOPS | Progress: (16/20) | 17.82 s
-[Task 25/25]  Current/Best:    1.52/   9.24 GFLOPS | Progress: (20/20) | 28.51 s
+[Task 25/25]  Current/Best:    1.50/   9.71 GFLOPS | Progress: (4/20) | 13.42 s Done.
+ Done.
+
+[Task 25/25]  Current/Best:    1.60/   9.71 GFLOPS | Progress: (8/20) | 20.87 s
+[Task 25/25]  Current/Best:   10.61/  10.61 GFLOPS | Progress: (12/20) | 31.47 s
+[Task 25/25]  Current/Best:    5.74/  10.61 GFLOPS | Progress: (16/20) | 34.57 s
+[Task 25/25]  Current/Best:   10.23/  10.61 GFLOPS | Progress: (20/20) | 45.52 s
 </pre></div>
 </div>
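The per-task progress lines above are emitted by AutoTVM's progress-bar callback while each extracted task is tuned in turn. As a minimal sketch (assuming a Relay module mod, its params, and an llvm target are already defined, and using a hypothetical records file name rather than the tutorial's), the tuning loop looks roughly like this:

    from tvm import autotvm

    # Extract one tuning task per tunable operator in the model.
    tasks = autotvm.task.extract_from_program(mod["main"], target=target, params=params)

    for i, task in enumerate(tasks):
        prefix = "[Task %2d/%2d] " % (i + 1, len(tasks))
        tuner = autotvm.tuner.XGBTuner(task)
        tuner.tune(
            n_trial=20,  # 20 trials per task matches the (x/20) progress shown above
            measure_option=autotvm.measure_option(
                builder=autotvm.LocalBuilder(build_func="default"),
                runner=autotvm.LocalRunner(number=10, repeat=1, timeout=10),
            ),
            callbacks=[
                autotvm.callback.progress_bar(20, prefix=prefix),
                autotvm.callback.log_to_file("autotvm-records.json"),  # hypothetical file name
            ],
        )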
 <p>The output from this tuning process will look something like this:</p>
@@ -986,7 +987,6 @@ model using optimized operators to speed up our computations.</p>
 </pre></div>
 </div>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Done.
-Done.
 </pre></div>
 </div>
 <p>Verify that the optimized model runs and produces the same results:</p>
@@ -1041,8 +1041,8 @@ improvement in comparing the optimized model to the unoptimized model.</p>
 <span class="nb">print</span><span class="p">(</span><span class="s2">&quot;unoptimized: </span><span class="si">%s</span><span class="s2">&quot;</span> <span class="o">%</span> <span class="p">(</span><a href="https://docs.python.org/3/library/stdtypes.html#dict" title="builtins.dict" class="sphx-glr-backref-module-builtins sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class="n">unoptimized</span></a><span class="p">))</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>optimized: {&#39;mean&#39;: 415.74742305001564, &#39;median&#39;: 415.8195728499777, &#39;std&#39;: 1.7602679764603735}
-unoptimized: {&#39;mean&#39;: 492.71140236000065, &#39;median&#39;: 492.71483809998244, &#39;std&#39;: 3.6515681597064877}
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>optimized: {&#39;mean&#39;: 379.3111107200457, &#39;median&#39;: 378.54401690001396, &#39;std&#39;: 2.0323214236615597}
+unoptimized: {&#39;mean&#39;: 466.5144750900072, &#39;median&#39;: 466.5101404500092, &#39;std&#39;: 1.2822160389899164}
 </pre></div>
 </div>
 </div>
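The optimized and unoptimized figures are plain run-time statistics (in milliseconds) over repeated inference runs. A sketch of how such numbers can be collected, assuming module is the compiled GraphModule with its input already set:

    import timeit
    import numpy as np

    timing_number = 10   # runs per measurement
    timing_repeat = 10   # independent measurements

    # Milliseconds per run for each of the 10 measurements.
    runs = (
        np.array(timeit.Timer(lambda: module.run()).repeat(repeat=timing_repeat, number=timing_number))
        * 1000
        / timing_number
    )
    optimized = {"mean": np.mean(runs), "median": np.median(runs), "std": np.std(runs)}
    print("optimized: %s" % optimized)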
@@ -1056,7 +1056,7 @@ models.</p>
 <p>Here we presented a simple example using ResNet-50 v2 locally. However, TVM
 supports many more features including cross-compilation, remote execution and
 profiling/benchmarking.</p>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 13 minutes  55.849 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 13 minutes  22.695 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-tutorial-autotvm-relay-x86-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../_downloads/57a45d9bef1af358191e7d50043e652c/autotvm_relay_x86.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">autotvm_relay_x86.py</span></code></a></p>
diff --git a/docs/tutorial/cross_compilation_and_rpc.html b/docs/tutorial/cross_compilation_and_rpc.html
index 7f7e9a6911..86d10e1508 100644
--- a/docs/tutorial/cross_compilation_and_rpc.html
+++ b/docs/tutorial/cross_compilation_and_rpc.html
@@ -558,7 +558,7 @@ device and returns the measured cost. Network overhead is excluded.</p>
 <span class="nb">print</span><span class="p">(</span><span class="s2">&quot;</span><span class="si">%g</span><span class="s2"> secs/op&quot;</span> <span class="o">%</span> <span class="n">cost</span><span class="p">)</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>1.269e-07 secs/op
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>1.207e-07 secs/op
 </pre></div>
 </div>
 </div>
diff --git a/docs/tutorial/intro_topi.html b/docs/tutorial/intro_topi.html
index 28446f3f09..677976e629 100644
--- a/docs/tutorial/intro_topi.html
+++ b/docs/tutorial/intro_topi.html
@@ -528,7 +528,7 @@ class Module:
 <div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="nb">print</span><span class="p">(</span><a href="../reference/api/python/ir.html#tvm.ir.Array" title="tvm.ir.Array" class="sphx-glr-backref-module-tvm-ir sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class="n">sg</span><span class="o">.</span><span class="n">stages</span></a><span class="p">)</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>[stage(a, placeholder(a, 0x2d9615b0)), stage(b, placeholder(b, 0x18f334d0)), stage(T_add, compute(T_add, body=[a[ax0, ax1, ax2] + b[ax1, ax2]], axis=[T.iter_var(ax0, T.Range(0, 100), &quot;DataPar&quot;, &quot;&quot;), T.iter_var(ax1, T.Range(0, 10), &quot;DataPar&quot;, &quot;&quot;), T.iter_var(ax2, T.Range(0, 10), &quot;DataPar&quot;, &quot;&quot;)], reduce_axis=[], tag=broadcast, attr [...]
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>[stage(a, placeholder(a, 0x1081b4e0)), stage(b, placeholder(b, 0x107ff4b0)), stage(T_add, compute(T_add, body=[a[ax0, ax1, ax2] + b[ax1, ax2]], axis=[T.iter_var(ax0, T.Range(0, 100), &quot;DataPar&quot;, &quot;&quot;), T.iter_var(ax1, T.Range(0, 10), &quot;DataPar&quot;, &quot;&quot;), T.iter_var(ax2, T.Range(0, 10), &quot;DataPar&quot;, &quot;&quot;)], reduce_axis=[], tag=broadcast, attr [...]
 </pre></div>
 </div>
 <p>We can test the correctness by comparing with <code class="code docutils literal notranslate"><span class="pre">numpy</span></code> result as follows</p>
diff --git a/docs/tutorial/sg_execution_times.html b/docs/tutorial/sg_execution_times.html
index 14305bcc59..6a35a3fa7d 100644
--- a/docs/tutorial/sg_execution_times.html
+++ b/docs/tutorial/sg_execution_times.html
@@ -360,7 +360,7 @@
             
   <div class="section" id="computation-times">
 <span id="sphx-glr-tutorial-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline">¶</a></h1>
-<p><strong>17:32.525</strong> total execution time for <strong>tutorial</strong> files:</p>
+<p><strong>16:46.661</strong> total execution time for <strong>tutorial</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 83%" />
@@ -369,35 +369,35 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="autotvm_relay_x86.html#sphx-glr-tutorial-autotvm-relay-x86-py"><span class="std std-ref">Compiling and Optimizing a Model with the Python Interface (AutoTVM)</span></a> (<code class="docutils literal notranslate"><span class="pre">autotvm_relay_x86.py</span></code>)</p></td>
-<td><p>13:55.849</p></td>
+<td><p>13:22.695</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="auto_scheduler_matmul_x86.html#sphx-glr-tutorial-auto-scheduler-matmul-x86-py"><span class="std std-ref">Optimizing Operators with Auto-scheduling</span></a> (<code class="docutils literal notranslate"><span class="pre">auto_scheduler_matmul_x86.py</span></code>)</p></td>
-<td><p>01:31.186</p></td>
+<td><p>01:30.520</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="tensor_expr_get_started.html#sphx-glr-tutorial-tensor-expr-get-started-py"><span class="std std-ref">Working with Operators Using Tensor Expression</span></a> (<code class="docutils literal notranslate"><span class="pre">tensor_expr_get_started.py</span></code>)</p></td>
-<td><p>01:01.600</p></td>
+<td><p>00:58.454</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="relay_quick_start.html#sphx-glr-tutorial-relay-quick-start-py"><span class="std std-ref">Quick Start Tutorial for Compiling Deep Learning Models</span></a> (<code class="docutils literal notranslate"><span class="pre">relay_quick_start.py</span></code>)</p></td>
-<td><p>00:42.693</p></td>
+<td><p>00:40.197</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="autotvm_matmul_x86.html#sphx-glr-tutorial-autotvm-matmul-x86-py"><span class="std std-ref">Optimizing Operators with Schedule Templates and AutoTVM</span></a> (<code class="docutils literal notranslate"><span class="pre">autotvm_matmul_x86.py</span></code>)</p></td>
-<td><p>00:19.107</p></td>
+<td><p>00:12.681</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
-<tr class="row-even"><td><p><a class="reference internal" href="intro_topi.html#sphx-glr-tutorial-intro-topi-py"><span class="std std-ref">Introduction to TOPI</span></a> (<code class="docutils literal notranslate"><span class="pre">intro_topi.py</span></code>)</p></td>
-<td><p>00:00.970</p></td>
+<tr class="row-even"><td><p><a class="reference internal" href="tensor_ir_blitz_course.html#sphx-glr-tutorial-tensor-ir-blitz-course-py"><span class="std std-ref">Blitz Course to TensorIR</span></a> (<code class="docutils literal notranslate"><span class="pre">tensor_ir_blitz_course.py</span></code>)</p></td>
+<td><p>00:00.989</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
-<tr class="row-odd"><td><p><a class="reference internal" href="tensor_ir_blitz_course.html#sphx-glr-tutorial-tensor-ir-blitz-course-py"><span class="std std-ref">Blitz Course to TensorIR</span></a> (<code class="docutils literal notranslate"><span class="pre">tensor_ir_blitz_course.py</span></code>)</p></td>
-<td><p>00:00.921</p></td>
+<tr class="row-odd"><td><p><a class="reference internal" href="intro_topi.html#sphx-glr-tutorial-intro-topi-py"><span class="std std-ref">Introduction to TOPI</span></a> (<code class="docutils literal notranslate"><span class="pre">intro_topi.py</span></code>)</p></td>
+<td><p>00:00.929</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="cross_compilation_and_rpc.html#sphx-glr-tutorial-cross-compilation-and-rpc-py"><span class="std std-ref">Cross Compilation and RPC</span></a> (<code class="docutils literal notranslate"><span class="pre">cross_compilation_and_rpc.py</span></code>)</p></td>
-<td><p>00:00.197</p></td>
+<td><p>00:00.196</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="uma.html#sphx-glr-tutorial-uma-py"><span class="std std-ref">Making your Hardware Accelerator TVM-ready with UMA</span></a> (<code class="docutils literal notranslate"><span class="pre">uma.py</span></code>)</p></td>
diff --git a/docs/tutorial/tensor_expr_get_started.html b/docs/tutorial/tensor_expr_get_started.html
index e26f4d4df2..7fdfe4854d 100644
--- a/docs/tutorial/tensor_expr_get_started.html
+++ b/docs/tutorial/tensor_expr_get_started.html
@@ -569,8 +569,8 @@ helper function to run a profile of the TVM generated code.</p>
 <span class="n">evaluate_addition</span><span class="p">(</span><span class="n">fadd</span><span class="p">,</span> <a href="../reference/api/python/target.html#tvm.target.Target" title="tvm.target.Target" class="sphx-glr-backref-module-tvm-target sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class="n">tgt</span></a><span class="p">,</span> <span class="s2">&quot;naive&quot;</span><span class="p">,</span> <a href="https://docs.python.org/3/library/stdtypes.html#list" ti [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Numpy running time: 0.000008
-naive: 0.000008
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Numpy running time: 0.000006
+naive: 0.000006
 </pre></div>
 </div>
 </div>
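Timings like the Numpy/naive pair above are typically gathered with the module's built-in time evaluator. A short sketch, assuming fadd is the function returned by tvm.build for the vector-add schedule, dev is the target device, and a, b, c are tvm.nd.array buffers of matching shape:

    # Average over 10 runs; result.mean is reported in seconds.
    evaluator = fadd.time_evaluator(fadd.entry_name, dev, number=10)
    mean_s = evaluator(a, b, c).mean
    print("naive: %f" % mean_s)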
@@ -664,7 +664,7 @@ factor to be the number of threads on your CPU.</p>
 <span class="nb">print</span><span class="p">(</span><a href="../reference/api/python/driver.html#tvm.lower" title="tvm.lower" class="sphx-glr-backref-module-tvm sphx-glr-backref-type-py-function"><span class="n">tvm</span><span class="o">.</span><span class="n">lower</span></a><span class="p">(</span><a href="../reference/api/python/te.html#tvm.te.Schedule" title="tvm.te.Schedule" class="sphx-glr-backref-module-tvm-te sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>vector: 0.000039
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>vector: 0.000038
 # from tvm.script import ir as I
 # from tvm.script import tir as T
 
@@ -701,10 +701,10 @@ class Module:
 </pre></div>
 </div>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Operator                  Timing             Performance
-   numpy    7.671109997318127e-06                    1.0
-   naive    7.830600000000001e-06      1.020790994098329
-parallel    7.766199999999999e-06     1.0123958596233291
-  vector             3.92464e-05       5.116130522665014
+   numpy    5.601880002359394e-06                    1.0
+   naive              5.6479e-06      1.0082150987920524
+parallel    7.738200000000002e-06      1.381357686480402
+  vector    3.7925999999999996e-05     6.770227135180754
 </pre></div>
 </div>
 <div class="admonition-code-specialization admonition">
@@ -1020,7 +1020,7 @@ matrix multiplication.</p>
 <span class="n">answer</span> <span class="o">=</span> <span class="n">numpy</span><span class="o">.</span><span class="n">dot</span><span class="p">(</span><span class="n">a</span><span class="o">.</span><span class="n">numpy</span><span class="p">(),</span> <span class="n">b</span><span class="o">.</span><span class="n">numpy</span><span class="p">())</span>
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Numpy running time: 0.018811
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Numpy running time: 0.013683
 </pre></div>
 </div>
 <p>Now we write a basic matrix multiplication using TVM TE and verify that it
@@ -1061,7 +1061,7 @@ optimizations.</p>
 <span class="n">evaluate_operation</span><span class="p">(</span><a href="../reference/api/python/te.html#tvm.te.Schedule" title="tvm.te.Schedule" class="sphx-glr-backref-module-tvm-te sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class="n">s</span></a><span class="p">,</span> <span class="p">[</span><a href="../reference/api/python/te.html#tvm.te.Tensor" title="tvm.te.Tensor" class="sphx-glr-backref-module-tvm-te sphx-glr-backref-type-py-class sphx-glr-backref-instance [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>none: 3.526149
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>none: 3.380417
 </pre></div>
 </div>
 <p>Let’s take a look at the intermediate representation of the operator and
@@ -1125,7 +1125,7 @@ schedule.</p>
 <span class="n">evaluate_operation</span><span class="p">(</span><a href="../reference/api/python/te.html#tvm.te.Schedule" title="tvm.te.Schedule" class="sphx-glr-backref-module-tvm-te sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class="n">s</span></a><span class="p">,</span> <span class="p">[</span><a href="../reference/api/python/te.html#tvm.te.Tensor" title="tvm.te.Tensor" class="sphx-glr-backref-module-tvm-te sphx-glr-backref-type-py-class sphx-glr-backref-instance [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>blocking: 0.287237
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>blocking: 0.290640
 </pre></div>
 </div>
 <p>By reordering the computation to take advantage of caching, you should see a
@@ -1174,7 +1174,7 @@ already cache friendly from our previous optimizations.</p>
 <span class="nb">print</span><span class="p">(</span><a href="../reference/api/python/driver.html#tvm.lower" title="tvm.lower" class="sphx-glr-backref-module-tvm sphx-glr-backref-type-py-function"><span class="n">tvm</span><span class="o">.</span><span class="n">lower</span></a><span class="p">(</span><a href="../reference/api/python/te.html#tvm.te.Schedule" title="tvm.te.Schedule" class="sphx-glr-backref-module-tvm-te sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>vectorization: 0.272752
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>vectorization: 0.265125
 # from tvm.script import ir as I
 # from tvm.script import tir as T
 
@@ -1223,7 +1223,7 @@ more cache friendly.</p>
 <span class="nb">print</span><span class="p">(</span><a href="../reference/api/python/driver.html#tvm.lower" title="tvm.lower" class="sphx-glr-backref-module-tvm sphx-glr-backref-type-py-function"><span class="n">tvm</span><span class="o">.</span><span class="n">lower</span></a><span class="p">(</span><a href="../reference/api/python/te.html#tvm.te.Schedule" title="tvm.te.Schedule" class="sphx-glr-backref-module-tvm-te sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>loop permutation: 0.118667
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>loop permutation: 0.112804
 # from tvm.script import ir as I
 # from tvm.script import tir as T
 
@@ -1293,7 +1293,7 @@ optimized schedule.</p>
 <span class="nb">print</span><span class="p">(</span><a href="../reference/api/python/driver.html#tvm.lower" title="tvm.lower" class="sphx-glr-backref-module-tvm sphx-glr-backref-type-py-function"><span class="n">tvm</span><span class="o">.</span><span class="n">lower</span></a><span class="p">(</span><a href="../reference/api/python/te.html#tvm.te.Schedule" title="tvm.te.Schedule" class="sphx-glr-backref-module-tvm-te sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>array packing: 0.107560
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>array packing: 0.103097
 # from tvm.script import ir as I
 # from tvm.script import tir as T
 
@@ -1359,7 +1359,7 @@ to `C</cite> when all the block results are ready.</p>
 <span class="nb">print</span><span class="p">(</span><a href="../reference/api/python/driver.html#tvm.lower" title="tvm.lower" class="sphx-glr-backref-module-tvm sphx-glr-backref-type-py-function"><span class="n">tvm</span><span class="o">.</span><span class="n">lower</span></a><span class="p">(</span><a href="../reference/api/python/te.html#tvm.te.Schedule" title="tvm.te.Schedule" class="sphx-glr-backref-module-tvm-te sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>block caching: 0.111709
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>block caching: 0.094137
 # from tvm.script import ir as I
 # from tvm.script import tir as T
 
@@ -1416,7 +1416,7 @@ of thread-level parallelization.</p>
 <span class="nb">print</span><span class="p">(</span><a href="../reference/api/python/driver.html#tvm.lower" title="tvm.lower" class="sphx-glr-backref-module-tvm sphx-glr-backref-type-py-function"><span class="n">tvm</span><span class="o">.</span><span class="n">lower</span></a><span class="p">(</span><a href="../reference/api/python/te.html#tvm.te.Schedule" title="tvm.te.Schedule" class="sphx-glr-backref-module-tvm-te sphx-glr-backref-type-py-class sphx-glr-backref-instance"><span class [...]
 </pre></div>
 </div>
-<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>parallelization: 0.132621
+<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>parallelization: 0.112458
 # from tvm.script import ir as I
 # from tvm.script import tir as T
 
@@ -1469,13 +1469,13 @@ working, we can compare the results.</p>
 </pre></div>
 </div>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>        Operator                  Timing             Performance
-            none      3.5261488836999995                     1.0
-        blocking     0.28723746299999997     0.08145925554300497
-   vectorization            0.2727517516      0.0773511727938727
-loop permutation             0.118667046     0.03365344173314721
-   array packing            0.1075599755    0.030503526381772336
-   block caching            0.1117090457    0.031680184071746664
- parallelization     0.13262143129999998     0.03761084278461886
+            none      3.3804169742999997                     1.0
+        blocking     0.29064017789999996     0.08597761167028346
+   vectorization     0.26512477300000004     0.07842960647033811
+loop permutation            0.1128040781    0.033369870923500175
+   array packing     0.10309722699999999    0.030498375728144858
+   block caching            0.0941365962     0.02784762853685923
+ parallelization            0.1124583613     0.03326760046319059
 </pre></div>
 </div>
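For reference, the blocking row in this comparison comes from a tiled te schedule. A minimal sketch, assuming a 1024x1024x1024 float32 matmul and a 32x32 tile (not the exact tutorial source):

    import tvm
    from tvm import te

    M = K = N = 1024
    bn = 32

    k = te.reduce_axis((0, K), "k")
    A = te.placeholder((M, K), name="A")
    B = te.placeholder((K, N), name="B")
    C = te.compute((M, N), lambda m, n: te.sum(A[m, k] * B[k, n], axis=k), name="C")

    s = te.create_schedule(C.op)
    # Tile the output into bn x bn blocks, split the reduction, and reorder so
    # that each block of C is computed with cache-friendly access.
    mo, no, mi, ni = s[C].tile(C.op.axis[0], C.op.axis[1], bn, bn)
    (kaxis,) = s[C].op.reduce_axis
    ko, ki = s[C].split(kaxis, factor=4)
    s[C].reorder(mo, no, ko, ki, mi, ni)

    func = tvm.build(s, [A, B, C], target="llvm", name="mmult")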
 <p>Note that the outputs on the web page reflect the running times on a
@@ -1507,7 +1507,6 @@ is</p>
 you can build generic templates of the matrix multiplication and other
 operations with tunable parameters that allow you to automatically optimize
 the computation for specific platforms.</p>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes  1.600 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-tutorial-tensor-expr-get-started-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../_downloads/40a01cffb015a67aaec0fad7e27cf80d/tensor_expr_get_started.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">tensor_expr_get_started.py</span></code></a></p>