Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2020/06/23 03:50:05 UTC

[GitHub] [incubator-tvm] leonwanghui opened a new pull request #5892: Add TVM application extension with WASM runtime

leonwanghui opened a new pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892


   ## Background
   As demonstrated in the TVM runtime [tutorials](https://tvm.apache.org/docs/tutorials/relay_quick_start.html), TVM already supports WASM as an optional hardware backend, so we can leverage the strengths of WebAssembly (portability, security) and the TVM runtime (domain-specific optimization) to build a flexible, auto-optimized operator backend for all deep learning frameworks.
   
   ## Proposal Summary
   This PR adds a new TVM application extension that provides a WASM operator backend for deep learning frameworks on top of the TVM runtime. We believe combining a WASM runtime with the TVM compiler stack opens up some interesting possibilities.
   
   Thanks for reviewing this proposal. Since it is still at an **experimental** stage, any suggestions or questions are welcome, and we hope more contributors will join in to help it grow.
   
   @tqchen @jroesch @kazum PTAL, thanks!
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-663316322


   cc @tqchen @nhynes, please note that some of the `clippy` warnings in the `tvm-graph-rt` module will not be addressed in this PR.





[GitHub] [incubator-tvm] nhynes commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
nhynes commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r457000729



##########
File path: apps/wasm-standalone/wasm-graph/src/lib.rs
##########
@@ -0,0 +1,83 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+#[macro_use]
+extern crate lazy_static;
+#[macro_use]
+extern crate serde_derive;
+extern crate ndarray;
+extern crate tvm_runtime;

Review comment:
       `edition = 2018` doesn't require `extern crate` declarations.
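
   For illustration, a minimal sketch of what the same imports could look like under the 2018 edition (assuming only the crates already listed in Cargo.toml); macros are pulled in with a plain `use` instead of `#[macro_use] extern crate`:

   ```rust
   // Edition 2018: crates declared in Cargo.toml are in scope without `extern crate`,
   // and derive/declarative macros can be imported with a regular `use`.
   use lazy_static::lazy_static;
   use serde_derive::{Deserialize, Serialize};
   ```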

##########
File path: apps/wasm-standalone/wasm-graph/src/lib.rs
##########
@@ -0,0 +1,83 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+#[macro_use]
+extern crate lazy_static;
+#[macro_use]
+extern crate serde_derive;
+extern crate ndarray;
+extern crate tvm_runtime;
+
+mod types;
+use types::Tensor;

Review comment:
       I usually format imports as 
   
   ```rust
   #[macro_use]
   extern crate extern_crate;
   
   mod internal_module;
   
   use std::any;
   
   use extern_crate::module;
   
   use internal_module::inner;
   ```
   This matches how Google formats Python imports.

##########
File path: apps/wasm-standalone/wasm-graphruntime/src/main.rs
##########
@@ -0,0 +1,154 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+#[macro_use]
+extern crate serde_derive;
+
+pub mod types;
+use types::Tensor;
+mod runtime;
+
+use getopts::Options;
+use image::{FilterType, GenericImageView};
+use ndarray::Array;
+use std::{collections::HashMap, env, fs::File, io::BufReader};
+
+const IMG_HEIGHT: usize = 224;
+const IMG_WIDTH: usize = 224;
+
+fn print_usage(program: &str, opts: Options) {

Review comment:
       I get the desire for minimalism, but [clap](https://crates.io/crates/clap) (especially the `clap_app` macro) would really make the code a lot cleaner.
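
   For reference, a rough sketch (not from the PR) of the same options using the clap 2.x builder API; the flag names mirror the getopts version above, and `-h/--help` is generated automatically:

   ```rust
   use clap::{App, Arg};

   fn main() {
       let matches = App::new("wasm-graphruntime")
           .arg(Arg::with_name("wasm-graph-file")
               .short("g").long("wasm-graph-file")
               .value_name("FILE_PATH")
               .help("set the path to wasm graph file")
               .takes_value(true))
           .arg(Arg::with_name("input-data-file")
               .short("i").long("input-data-file")
               .value_name("FILE_PATH")
               .help("set the path to input image file")
               .takes_value(true))
           .arg(Arg::with_name("label-class-file")
               .short("l").long("label-class-file")
               .value_name("FILE_PATH")
               .help("set the path to label class file")
               .takes_value(true))
           .get_matches();

       // `value_of` returns Option<&str>; fall back to "" like the getopts version does.
       let wasm_graph_file = matches.value_of("wasm-graph-file").unwrap_or_default();
       let input_data_file = matches.value_of("input-data-file").unwrap_or_default();
       let label_class_file = matches.value_of("label-class-file").unwrap_or_default();
       println!("{} {} {}", wasm_graph_file, input_data_file, label_class_file);
   }
   ```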

##########
File path: apps/wasm-standalone/wasm-graphruntime/src/main.rs
##########
@@ -0,0 +1,154 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+#[macro_use]
+extern crate serde_derive;
+
+pub mod types;
+use types::Tensor;
+mod runtime;
+
+use getopts::Options;
+use image::{FilterType, GenericImageView};
+use ndarray::Array;
+use std::{collections::HashMap, env, fs::File, io::BufReader};
+
+const IMG_HEIGHT: usize = 224;
+const IMG_WIDTH: usize = 224;
+
+fn print_usage(program: &str, opts: Options) {
+    let brief = format!("Usage: {} [options]", program);
+    print!("{}", opts.usage(&brief));
+}
+
+fn main() {
+    let args: Vec<String> = env::args().collect();
+    let program = args[0].clone();
+
+    let mut opts = Options::new();
+    opts.optopt(
+        "g",
+        "wasm-graph-file",
+        "set the path to wasm graph file",
+        "FILE_PATH",
+    );
+    opts.optopt(
+        "i",
+        "input-data-file",
+        "set the path to input image file",
+        "FILE_PATH",
+    );
+    opts.optopt(
+        "l",
+        "label-class-file",
+        "set the path to label class file",
+        "FILE_PATH",
+    );
+    opts.optflag("h", "help", "print this help menu");
+    let matches = match opts.parse(&args[1..]) {
+        Ok(m) => m,
+        Err(f) => panic!(f.to_string()),
+    };
+    if matches.opt_present("h") {
+        print_usage(&program, opts);
+        return;
+    }
+    let wasm_graph_file: String = match matches.opt_str("g") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let input_data_file: String = match matches.opt_str("i") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let label_class_file: String = match matches.opt_str("l") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let img = image::open(input_data_file).unwrap();
+    let input = data_preprocess(img);
+
+    let output: Tensor = match runtime::execute(wasm_graph_file, input) {
+        Ok(m) => m,
+        Err(f) => panic!(f.to_string()),
+    };
+    output_assert(output, label_class_file);
+}
+
+fn data_preprocess(img: image::DynamicImage) -> Tensor {
+    println!("original image dimensions: {:?}", img.dimensions());
+    let img = img
+        .resize_exact(IMG_HEIGHT as u32, IMG_WIDTH as u32, FilterType::Nearest)
+        .to_rgb();
+    println!("resized image dimensions: {:?}", img.dimensions());
+    let mut pixels: Vec<f32> = vec![];

Review comment:
       `Vec::with_capacity(img.len())`, since you already know the size. This is just a micro-optimization, though 😉
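
   For reference, a sketch of the pre-allocation the comment has in mind (the capacity expression below is an assumption; three `f32` values are pushed per pixel):

   ```rust
   // 3 channels per pixel, so the final length is known up front.
   let mut pixels: Vec<f32> = Vec::with_capacity(3 * IMG_HEIGHT * IMG_WIDTH);
   ```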

##########
File path: apps/wasm-standalone/wasm-graph/Cargo.toml
##########
@@ -0,0 +1,43 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+[package]
+name = "wasm-graph"
+version = "0.1.0"
+authors = ["TVM Contributors"]
+edition = "2018"
+description = "WebAssembly graph to deep learning frameworks using TVM"
+readme = "README.md"
+repository = "https://github.com/apache/incubator-tvm"
+license = "Apache-2.0"
+keywords = ["wasm", "machine learning", "tvm"]
+
+[profile.release]
+lto = true
+opt-level = 's'
+
+[lib]
+crate-type = ['cdylib']
+
+[dependencies]
+serde = "1.0.53"
+serde_derive = "1.0.53"
+serde_json = "1.0.53"
+ndarray = "0.12"
+tvm-common = { version = "0.1", path = "../../../rust/common" }
+tvm-runtime = { version = "0.1", path = "../../../rust/runtime" }
+lazy_static = "1.1.1"

Review comment:
       Patch versions are superfluous, as Cargo [defaults to caret requirements](https://doc.rust-lang.org/cargo/reference/specifying-dependencies.html#caret-requirements).

##########
File path: apps/wasm-standalone/wasm-graph/tools/build_graph_lib.py
##########
@@ -0,0 +1,73 @@
+#!/usr/bin/env python3
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""Builds a simple graph for testing."""
+import argparse
+import os
+import subprocess
+import sys
+
+import onnx
+import tvm
+from tvm import relay
+
+
+def _get_mod_and_params(model_file):
+    onnx_model = onnx.load(model_file)
+    shape_dict = {}
+    for input in onnx_model.graph.input:
+        shape_dict[input.name] = [dim.dim_value for dim in input.type.tensor_type.shape.dim]
+
+    return relay.frontend.from_onnx(onnx_model, shape_dict)
+
+
+def build_graph_lib(model_file, opt_level):
+    """Compiles the pre-trained model with TVM"""
+    out_dir = os.path.join(sys.path[0], "../lib")
+    if not os.path.exists(out_dir):
+        os.makedirs(out_dir)
+
+    # Compile the relay mod
+    mod, params = _get_mod_and_params(model_file)
+    target = 'llvm -target=wasm32-unknown-unknown -mattr=+simd128 --system-lib'

Review comment:
       cool simd!

##########
File path: apps/wasm-standalone/wasm-graph/src/types.rs
##########
@@ -0,0 +1,183 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use std::{
+    any::TypeId,
+    convert::From,
+    os::raw::{c_int, c_void},
+    slice,
+};
+pub use tvm_common::ffi::DLTensor;
+use tvm_common::ffi::{
+    DLContext, DLDataType, DLDataTypeCode_kDLFloat, DLDataTypeCode_kDLInt, DLDeviceType_kDLCPU,
+};
+
+#[derive(Debug, PartialEq, Clone, Serialize, Deserialize)]
+pub enum DataType {
+    FP32,
+    INT32,
+    INT8,
+}
+
+impl DataType {
+    pub fn as_dldtype(&self) -> DLDataType {
+        match self {
+            DataType::INT32 => DLDataType {
+                code: DLDataTypeCode_kDLInt as u8,
+                bits: 32u8,
+                lanes: 1u16,
+            },
+            DataType::INT8 => DLDataType {
+                code: DLDataTypeCode_kDLInt as u8,
+                bits: 8u8,
+                lanes: 1u16,
+            },
+            DataType::FP32 => DLDataType {
+                code: DLDataTypeCode_kDLFloat as u8,
+                bits: 32u8,
+                lanes: 1u16,
+            },
+        }
+    }
+
+    /// Returns whether this `DataType` represents primitive type `T`.
+    pub fn is_type<T: 'static>(&self) -> bool {
+        let typ = TypeId::of::<T>();
+        typ == TypeId::of::<i32>() || typ == TypeId::of::<i8>() || typ == TypeId::of::<f32>()
+    }
+}
+
+impl From<DLDataType> for DataType {
+    fn from(dl_dtype: DLDataType) -> Self {
+        if dl_dtype.code == DLDataTypeCode_kDLInt as u8 && dl_dtype.bits == 32u8 {
+            DataType::INT32
+        } else if dl_dtype.code == DLDataTypeCode_kDLInt as u8 && dl_dtype.bits == 8u8 {
+            DataType::INT8
+        } else if dl_dtype.code == DLDataTypeCode_kDLFloat as u8 && dl_dtype.bits == 32u8 {
+            DataType::FP32
+        } else {
+            DataType::FP32
+        }
+    }
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct Tensor {

Review comment:
       Yeah, this is unfortunate. Would you mind making the patch to the actual tensor? If it's something you find yourself needing, others likely would, too.

##########
File path: apps/wasm-standalone/wasm-graph/src/utils.rs
##########
@@ -0,0 +1,48 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use super::types::*;
+use serde_json;
+use std::ptr;
+
+pub fn load_input(in_addr: i32, in_size: usize) -> Tensor {

Review comment:
       a function that takes a raw pointer (albeit into linear memory) should likely be an `unsafe fn`
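
   The body of `load_input` is not shown in this hunk; a hypothetical sketch of the `unsafe` signature and safety contract, assuming the function deserializes a JSON-encoded `Tensor` from linear memory:

   ```rust
   /// # Safety
   /// `in_addr` must point to at least `in_size` valid, initialized bytes of
   /// wasm linear memory for the duration of the call.
   pub unsafe fn load_input(in_addr: i32, in_size: usize) -> Tensor {
       let in_slice = std::slice::from_raw_parts(in_addr as *const u8, in_size);
       serde_json::from_slice(in_slice).expect("failed to deserialize input Tensor")
   }
   ```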

##########
File path: apps/wasm-standalone/wasm-graph/src/types.rs
##########
@@ -0,0 +1,183 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use std::{
+    any::TypeId,
+    convert::From,

Review comment:
       ```suggestion
   ```
   `From` is in the prelude, so this import is unnecessary.

##########
File path: apps/wasm-standalone/wasm-graph/build.rs
##########
@@ -0,0 +1,27 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use std::path::PathBuf;
+
+fn main() {
+    let mut out_dir = PathBuf::from(env!("CARGO_MANIFEST_DIR"));
+    out_dir.push("lib");

Review comment:
       ```suggestion
   	let out_dir = concat!(env!("CARGO_MANIFEST_DIR"), "/lib");
   ```
   no need for runtime path construction

##########
File path: apps/wasm-standalone/wasm-graphruntime/src/main.rs
##########
@@ -0,0 +1,154 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+#[macro_use]
+extern crate serde_derive;
+
+pub mod types;
+use types::Tensor;
+mod runtime;
+
+use getopts::Options;
+use image::{FilterType, GenericImageView};
+use ndarray::Array;
+use std::{collections::HashMap, env, fs::File, io::BufReader};
+
+const IMG_HEIGHT: usize = 224;
+const IMG_WIDTH: usize = 224;
+
+fn print_usage(program: &str, opts: Options) {
+    let brief = format!("Usage: {} [options]", program);
+    print!("{}", opts.usage(&brief));
+}
+
+fn main() {
+    let args: Vec<String> = env::args().collect();
+    let program = args[0].clone();
+
+    let mut opts = Options::new();
+    opts.optopt(
+        "g",
+        "wasm-graph-file",
+        "set the path to wasm graph file",
+        "FILE_PATH",
+    );
+    opts.optopt(
+        "i",
+        "input-data-file",
+        "set the path to input image file",
+        "FILE_PATH",
+    );
+    opts.optopt(
+        "l",
+        "label-class-file",
+        "set the path to label class file",
+        "FILE_PATH",
+    );
+    opts.optflag("h", "help", "print this help menu");
+    let matches = match opts.parse(&args[1..]) {
+        Ok(m) => m,
+        Err(f) => panic!(f.to_string()),
+    };
+    if matches.opt_present("h") {
+        print_usage(&program, opts);
+        return;
+    }
+    let wasm_graph_file: String = match matches.opt_str("g") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let input_data_file: String = match matches.opt_str("i") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let label_class_file: String = match matches.opt_str("l") {
+        Some(s) => s,
+        None => String::from(""),

Review comment:
       `unwrap_or_default()`
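
   That is, roughly (a sketch; `opt_str` returns `Option<String>`, and `String::default()` is the empty string):

   ```rust
   let wasm_graph_file = matches.opt_str("g").unwrap_or_default();
   let input_data_file = matches.opt_str("i").unwrap_or_default();
   let label_class_file = matches.opt_str("l").unwrap_or_default();
   ```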

##########
File path: apps/wasm-standalone/wasm-graphruntime/src/runtime.rs
##########
@@ -0,0 +1,74 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use super::types::Tensor;
+use anyhow::Result;
+use serde_json;

Review comment:
       ```suggestion
   ```
   already in the namespace, per 2018 semantics

##########
File path: tests/lint/check_file_type.py
##########
@@ -159,7 +160,7 @@ def copyright_line(line):
     if line.find("Copyright " + "(c)") != -1:
         return True
     if (line.find("Copyright") != -1 and
-        line.find(" by") != -1):
+            line.find(" by") != -1):

Review comment:
       this doesn't conflict with the existing lint rules? hmm

##########
File path: apps/wasm-standalone/wasm-graphruntime/Cargo.toml
##########
@@ -0,0 +1,37 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+[package]
+name = "wasm-graphruntime"
+version = "0.1.0"
+authors = ["TVM Contributors"]
+edition = "2018"
+description = "WebAssembly graph runtime to deep learning frameworks using wasmtime"
+license = "Apache-2.0"
+keywords = ["wasm", "machine learning", "wasmtime"]
+
+[dependencies]
+wasmtime = "0.16.0"
+wasmtime-wasi = "0.16.0"
+anyhow = "1.0.31"
+serde = "1.0.53"
+serde_json = "1.0.53"
+serde_derive = "1.0.53"
+getopts = "0.2.21"
+ndarray = "0.12"
+csv = "1.1"
+image = "0.20"

Review comment:
       sort these, please

##########
File path: apps/wasm-standalone/wasm-graphruntime/src/main.rs
##########
@@ -0,0 +1,154 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+#[macro_use]
+extern crate serde_derive;
+
+pub mod types;
+use types::Tensor;
+mod runtime;
+
+use getopts::Options;
+use image::{FilterType, GenericImageView};
+use ndarray::Array;
+use std::{collections::HashMap, env, fs::File, io::BufReader};
+
+const IMG_HEIGHT: usize = 224;
+const IMG_WIDTH: usize = 224;
+
+fn print_usage(program: &str, opts: Options) {
+    let brief = format!("Usage: {} [options]", program);
+    print!("{}", opts.usage(&brief));
+}
+
+fn main() {
+    let args: Vec<String> = env::args().collect();
+    let program = args[0].clone();
+
+    let mut opts = Options::new();
+    opts.optopt(
+        "g",
+        "wasm-graph-file",
+        "set the path to wasm graph file",
+        "FILE_PATH",
+    );
+    opts.optopt(
+        "i",
+        "input-data-file",
+        "set the path to input image file",
+        "FILE_PATH",
+    );
+    opts.optopt(
+        "l",
+        "label-class-file",
+        "set the path to label class file",
+        "FILE_PATH",
+    );
+    opts.optflag("h", "help", "print this help menu");
+    let matches = match opts.parse(&args[1..]) {
+        Ok(m) => m,
+        Err(f) => panic!(f.to_string()),
+    };
+    if matches.opt_present("h") {
+        print_usage(&program, opts);
+        return;
+    }
+    let wasm_graph_file: String = match matches.opt_str("g") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let input_data_file: String = match matches.opt_str("i") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let label_class_file: String = match matches.opt_str("l") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let img = image::open(input_data_file).unwrap();
+    let input = data_preprocess(img);
+
+    let output: Tensor = match runtime::execute(wasm_graph_file, input) {

Review comment:
       `map_err(|e| panic!("{}", e))`
   
   Have you tried running `cargo clippy` on this codebase? I think it'll give you some good suggestions
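
   For example, one way to collapse the `match` along those lines (a sketch; `unwrap_or_else` supplies the final unwrap that `map_err` alone would still need):

   ```rust
   // Panics with the formatted error message if execution fails.
   let output: Tensor = runtime::execute(wasm_graph_file, input)
       .unwrap_or_else(|e| panic!("{}", e));
   ```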

##########
File path: apps/wasm-standalone/wasm-graphruntime/src/main.rs
##########
@@ -0,0 +1,154 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+#[macro_use]
+extern crate serde_derive;
+
+pub mod types;
+use types::Tensor;
+mod runtime;
+
+use getopts::Options;
+use image::{FilterType, GenericImageView};
+use ndarray::Array;
+use std::{collections::HashMap, env, fs::File, io::BufReader};
+
+const IMG_HEIGHT: usize = 224;
+const IMG_WIDTH: usize = 224;
+
+fn print_usage(program: &str, opts: Options) {
+    let brief = format!("Usage: {} [options]", program);
+    print!("{}", opts.usage(&brief));
+}
+
+fn main() {
+    let args: Vec<String> = env::args().collect();
+    let program = args[0].clone();
+
+    let mut opts = Options::new();
+    opts.optopt(
+        "g",
+        "wasm-graph-file",
+        "set the path to wasm graph file",
+        "FILE_PATH",
+    );
+    opts.optopt(
+        "i",
+        "input-data-file",
+        "set the path to input image file",
+        "FILE_PATH",
+    );
+    opts.optopt(
+        "l",
+        "label-class-file",
+        "set the path to label class file",
+        "FILE_PATH",
+    );
+    opts.optflag("h", "help", "print this help menu");
+    let matches = match opts.parse(&args[1..]) {
+        Ok(m) => m,
+        Err(f) => panic!(f.to_string()),
+    };
+    if matches.opt_present("h") {
+        print_usage(&program, opts);
+        return;
+    }
+    let wasm_graph_file: String = match matches.opt_str("g") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let input_data_file: String = match matches.opt_str("i") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let label_class_file: String = match matches.opt_str("l") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let img = image::open(input_data_file).unwrap();
+    let input = data_preprocess(img);
+
+    let output: Tensor = match runtime::execute(wasm_graph_file, input) {
+        Ok(m) => m,
+        Err(f) => panic!(f.to_string()),
+    };
+    output_assert(output, label_class_file);
+}
+
+fn data_preprocess(img: image::DynamicImage) -> Tensor {
+    println!("original image dimensions: {:?}", img.dimensions());
+    let img = img
+        .resize_exact(IMG_HEIGHT as u32, IMG_WIDTH as u32, FilterType::Nearest)
+        .to_rgb();
+    println!("resized image dimensions: {:?}", img.dimensions());
+    let mut pixels: Vec<f32> = vec![];
+    for pixel in img.pixels() {
+        let tmp = pixel.data;
+        // normalize the RGB channels using mean, std of imagenet1k
+        let tmp = [
+            (tmp[0] as f32 - 123.0) / 58.395, // R
+            (tmp[1] as f32 - 117.0) / 57.12,  // G
+            (tmp[2] as f32 - 104.0) / 57.375, // B
+        ];
+        for e in &tmp {
+            pixels.push(*e);
+        }
+    }
+
+    // (H,W,C) -> (C,H,W)
+    let arr = Array::from_shape_vec((IMG_HEIGHT, IMG_WIDTH, 3), pixels).unwrap();
+    let arr = arr.permuted_axes([2, 0, 1]);
+    let arr = Array::from_iter(arr.into_iter().map(|&v| v));
+
+    return Tensor::from(arr);
+}
+
+fn output_assert(out_tensor: Tensor, label_class_file: String) {
+    let output = out_tensor.to_vec::<f32>();
+
+    // Find the maximum entry in the output and its index.
+    let mut argmax = -1;
+    let mut max_prob = 0.;
+    for i in 0..output.len() {

Review comment:
       alternatively, `output.iter().enumerate().max_by_key(|(i, v)| ...)`, if you're feeling functional
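
   For reference, a sketch of the functional form; since `f32` is not `Ord`, `max_by` with `partial_cmp` is needed rather than `max_by_key`:

   ```rust
   // (index, value) of the largest entry; falls back to (-1, 0.0) for empty output.
   let (argmax, max_prob) = output
       .iter()
       .enumerate()
       .max_by(|(_, a), (_, b)| a.partial_cmp(b).unwrap())
       .map(|(i, v)| (i as i32, *v))
       .unwrap_or((-1, 0.0));
   ```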

##########
File path: apps/wasm-standalone/wasm-graphruntime/src/runtime.rs
##########
@@ -0,0 +1,74 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use super::types::Tensor;
+use anyhow::Result;
+use serde_json;
+use wasmtime::*;
+use wasmtime_wasi::{Wasi, WasiCtx};
+
+pub fn execute(wasm_backend_file: String, input_data: Tensor) -> Result<Tensor> {
+    let engine = Engine::new(Config::new().wasm_simd(true));
+    let store = Store::new(&engine);
+
+    // First set up our linker which is going to be linking modules together. We
+    // want our linker to have wasi available, so we set that up here as well.
+    let mut linker = Linker::new(&store);
+    // Create an instance of `Wasi` which contains a `WasiCtx`. Note that
+    // `WasiCtx` provides a number of ways to configure what the target program
+    // will have access to.
+    let wasi = Wasi::new(&store, WasiCtx::new(std::env::args())?);
+    wasi.add_to_linker(&mut linker)?;
+
+    let module = Module::from_file(&store, &wasm_backend_file)?;
+    let instance = linker.instantiate(&module)?;
+    let memory = instance
+        .get_memory("memory")
+        .ok_or(anyhow::format_err!("failed to find `memory` export"))?;

Review comment:
       why not add `format_err` to the `use anyhow`?
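
   That is, something like this fragment (a sketch of the adjusted import and one call site):

   ```rust
   use anyhow::{format_err, Result};

   let memory = instance
       .get_memory("memory")
       .ok_or(format_err!("failed to find `memory` export"))?;
   ```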

##########
File path: apps/wasm-standalone/wasm-graphruntime/src/main.rs
##########
@@ -0,0 +1,154 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+#[macro_use]
+extern crate serde_derive;
+
+pub mod types;
+use types::Tensor;
+mod runtime;
+
+use getopts::Options;
+use image::{FilterType, GenericImageView};
+use ndarray::Array;
+use std::{collections::HashMap, env, fs::File, io::BufReader};
+
+const IMG_HEIGHT: usize = 224;
+const IMG_WIDTH: usize = 224;
+
+fn print_usage(program: &str, opts: Options) {
+    let brief = format!("Usage: {} [options]", program);
+    print!("{}", opts.usage(&brief));
+}
+
+fn main() {
+    let args: Vec<String> = env::args().collect();
+    let program = args[0].clone();
+
+    let mut opts = Options::new();
+    opts.optopt(
+        "g",
+        "wasm-graph-file",
+        "set the path to wasm graph file",
+        "FILE_PATH",
+    );
+    opts.optopt(
+        "i",
+        "input-data-file",
+        "set the path to input image file",
+        "FILE_PATH",
+    );
+    opts.optopt(
+        "l",
+        "label-class-file",
+        "set the path to label class file",
+        "FILE_PATH",
+    );
+    opts.optflag("h", "help", "print this help menu");
+    let matches = match opts.parse(&args[1..]) {
+        Ok(m) => m,
+        Err(f) => panic!(f.to_string()),
+    };
+    if matches.opt_present("h") {
+        print_usage(&program, opts);
+        return;
+    }
+    let wasm_graph_file: String = match matches.opt_str("g") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let input_data_file: String = match matches.opt_str("i") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let label_class_file: String = match matches.opt_str("l") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let img = image::open(input_data_file).unwrap();
+    let input = data_preprocess(img);
+
+    let output: Tensor = match runtime::execute(wasm_graph_file, input) {
+        Ok(m) => m,
+        Err(f) => panic!(f.to_string()),
+    };
+    output_assert(output, label_class_file);
+}
+
+fn data_preprocess(img: image::DynamicImage) -> Tensor {
+    println!("original image dimensions: {:?}", img.dimensions());
+    let img = img
+        .resize_exact(IMG_HEIGHT as u32, IMG_WIDTH as u32, FilterType::Nearest)
+        .to_rgb();
+    println!("resized image dimensions: {:?}", img.dimensions());
+    let mut pixels: Vec<f32> = vec![];
+    for pixel in img.pixels() {
+        let tmp = pixel.data;
+        // normalize the RGB channels using mean, std of imagenet1k
+        let tmp = [
+            (tmp[0] as f32 - 123.0) / 58.395, // R
+            (tmp[1] as f32 - 117.0) / 57.12,  // G
+            (tmp[2] as f32 - 104.0) / 57.375, // B
+        ];
+        for e in &tmp {
+            pixels.push(*e);
+        }
+    }
+
+    // (H,W,C) -> (C,H,W)
+    let arr = Array::from_shape_vec((IMG_HEIGHT, IMG_WIDTH, 3), pixels).unwrap();
+    let arr = arr.permuted_axes([2, 0, 1]);
+    let arr = Array::from_iter(arr.into_iter().map(|&v| v));
+
+    return Tensor::from(arr);
+}
+
+fn output_assert(out_tensor: Tensor, label_class_file: String) {
+    let output = out_tensor.to_vec::<f32>();
+
+    // Find the maximum entry in the output and its index.
+    let mut argmax = -1;
+    let mut max_prob = 0.;
+    for i in 0..output.len() {
+        if output[i] > max_prob {
+            max_prob = output[i];
+            argmax = i as i32;
+        }
+    }
+
+    // Create a hash map of (class id, class name)
+    let mut synset: HashMap<i32, String> = HashMap::new();

Review comment:
       strictly speaking, the map is dense in the key set, so you could use a `Vec` if you had the length beforehand. Minor memory saving. I wouldn't bother with it.
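
   A sketch of that alternative, assuming a hypothetical `num_classes` count known before reading the file:

   ```rust
   // Class ids are dense in 0..num_classes, so a Vec indexed by id would suffice.
   let mut synset: Vec<String> = Vec::with_capacity(num_classes);
   ```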

##########
File path: apps/wasm-standalone/wasm-graphruntime/src/runtime.rs
##########
@@ -0,0 +1,74 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use super::types::Tensor;
+use anyhow::Result;
+use serde_json;
+use wasmtime::*;
+use wasmtime_wasi::{Wasi, WasiCtx};
+
+pub fn execute(wasm_backend_file: String, input_data: Tensor) -> Result<Tensor> {
+    let engine = Engine::new(Config::new().wasm_simd(true));
+    let store = Store::new(&engine);
+
+    // First set up our linker which is going to be linking modules together. We
+    // want our linker to have wasi available, so we set that up here as well.
+    let mut linker = Linker::new(&store);
+    // Create an instance of `Wasi` which contains a `WasiCtx`. Note that
+    // `WasiCtx` provides a number of ways to configure what the target program
+    // will have access to.
+    let wasi = Wasi::new(&store, WasiCtx::new(std::env::args())?);
+    wasi.add_to_linker(&mut linker)?;
+
+    let module = Module::from_file(&store, &wasm_backend_file)?;
+    let instance = linker.instantiate(&module)?;
+    let memory = instance
+        .get_memory("memory")
+        .ok_or(anyhow::format_err!("failed to find `memory` export"))?;
+
+    // Specify the wasm address to access the wasm memory.
+    let wasm_addr = memory.data_size();
+    // Serialize the data into a JSON string.
+    let in_data = serde_json::to_vec(&input_data)?;
+    let in_size = in_data.len();
+    // Grow up memory size according to in_size to avoid memory leak.
+    memory.grow((in_size >> 16) as u32 + 1)?;
+
+    // Insert the input data into wasm memory.
+    for i in 0..in_size {
+        unsafe {
+            memory.data_unchecked_mut()[wasm_addr + i] = *in_data.get(i).unwrap();
+        }
+    }
+
+    // Invoke `run` export.
+    let run = instance
+        .get_func("run")
+        .ok_or(anyhow::format_err!("failed to find `run` function export!"))?

Review comment:
       clippy will ask you to make this an `ok_or_else`
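
   That is, the lazily-evaluated form (a sketch):

   ```rust
   let run = instance
       .get_func("run")
       .ok_or_else(|| anyhow::format_err!("failed to find `run` function export!"))?;
   ```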







[GitHub] [incubator-tvm] jroesch commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
jroesch commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-660317393


   I have an open PR (https://github.com/apache/incubator-tvm/pull/6011) that moves all of the Rust code to the updated bindings. It might make sense to land this against the new bindings; what do you think? The changes should be very minimal. I'm hoping to land that branch today or Monday.





[GitHub] [incubator-tvm] leonwanghui commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r453150111



##########
File path: apps/wasm-graphcompiler-tvm/README.md
##########
@@ -0,0 +1,191 @@
+# WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime
+

Review comment:
       Sure







[GitHub] [incubator-tvm] tqchen merged pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
tqchen merged pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892


   





[GitHub] [incubator-tvm] leonwanghui commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r461252642



##########
File path: apps/wasm-standalone/wasm-graph/src/types.rs
##########
@@ -0,0 +1,182 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use std::{
+    any::TypeId,
+    os::raw::{c_int, c_void},
+    slice,
+};
+pub use tvm_sys::ffi::DLTensor;
+use tvm_sys::ffi::{
+    DLContext, DLDataType, DLDataTypeCode_kDLFloat, DLDataTypeCode_kDLInt, DLDeviceType_kDLCPU,
+};
+
+#[derive(Debug, PartialEq, Clone, Serialize, Deserialize)]
+pub enum DataType {

Review comment:
       @jroesch Yes, definitely. These duplicated structures (`Tensor` and `DataType`) will eventually be replaced by the official ones in the Rust runtime. I would be happy to create a follow-up PR to fix it.







[GitHub] [incubator-tvm] leonwanghui commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-656543313


   > please remove the csv file as we cannot checkin binary to the codebase
   
   @tqchen `Done`





[GitHub] [incubator-tvm] leonwanghui commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-660471166


   Thanks @jroesch for the quick review. This PR is actually tightly bound to the TVM Rust runtime, but currently I have to define a separate `Tensor` struct to meet the serialization requirements, and I even modified one of its private member methods. I'm not sure what the best option is to address this problem; any thoughts?





[GitHub] [incubator-tvm] tqchen commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
tqchen commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r452946065



##########
File path: apps/wasm-graphcompiler-tvm/README.md
##########
@@ -0,0 +1,188 @@
+# WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime
+
+#### Experimental notice: This project is still *experimental* and only serves as a proof of concept for running deep learning frameworks (such like [MindSpore](https://github.com/mindspore-ai/mindspore)) on [WebAssembly runtime](https://github.com/bytecodealliance/wasmtime) with [TVM stack](https://tvm.apache.org/).
+
+- [WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime](#webassembly-graphcompiler-for-deep-learning-framework-with-tvm-runtime)
+    - [Motivation](#motivation)
+    - [Framework Landscape](#framework-landscape)
+    - [Project Status](#project-status)
+    - [PoC Guidelines](#poc-guidelines)
+        - [Pre-installation](#pre-installation)
+        - [Build ResNet50 model](#build-resnet50-model)
+        - [Build wasm-graphcompiler-tvm package](#build-wasm-graphcompiler-tvm-package)
+        - [Test](#test)
+    - [Future Work](#future-work)
+        - [More networks support](#more-networks-support)
+        - [Performance benchmark](#performance-benchmark)
+        - [Native TVM Rust runtime support](#native-tvm-rust-runtime-support)
+    - [Appendix](#appendix)
+        - [System packages install](#system-packages-install)
+    - [Contribution](#contribution)
+
+## Motivation
+
+<img src="https://github.com/dmlc/web-data/raw/master/tvm/tutorial/tvm_support_list.png" alt="TVM hardware support" width="600"/>
+
+As demonstrated in TVM runtime [tutorials](https://tvm.apache.org/docs/tutorials/relay_quick_start.html), TVM already supports WASM as the optional hardware backend, so we can leverage the features of WebAssembly (portability, security) and TVM runtime (domain-specific, optimization) to build a flexible and auto-optimized graph compiler for all deep learning frameworks.
+
+## Framework Landscape
+
+The figures below demonstrate the whole landscape of running deep learning frameworks on WASM runtime with TVM compiler stack.
+
+* WASM graph compiler stack
+    ```
+       _ _ _ _ _ _ _ _ _ _        _ _ _ _ _ _ _        _ _ _ _ _ _ _ _ _ _ _ _
+      |                   |      |             |      |                       |
+      |  Framework Model  | ---> |  ONNX Model | ---> |  TVM Relay Python API |
+      |_ _ _ _ _ _ _ _ _ _|      |_ _ _ _ _ _ _|      |_ _ _ _ _ _ _ _ _ _ _ _|
+                                                                 ||
+                                                                 \/
+                 _ _ _ _ _ _ _ _ _ _ _                  _ _ _ _ _ _ _ _ _ _ _
+                |                     |                |                     |
+                | WASM Graph Compiler |                |  TVM Compiler Stack |
+                |    (TVM runtime)    |                |_ _ _ _ _ _ _ _ _ _ _|

Review comment:
       I see. Perhaps we should rename it to the app code, since we are writing an app using the TVM Rust runtime?

##########
File path: apps/wasm-graphcompiler-tvm/wasm-graphcompiler/Cargo.toml
##########
@@ -0,0 +1,26 @@
+[package]
+name = "wasm-graphcompiler-tvm"
+version = "0.1.0"
+authors = ["leonwanghui <wa...@gmail.com>"]

Review comment:
       Let us rename the authors to TVM contributors. The rationale is that the same code will be contributed to by multiple contributors, and the contributions are already recorded in the commit history.

##########
File path: apps/wasm-graphcompiler-tvm/README.md
##########
@@ -0,0 +1,191 @@
+# WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime
+
+#### Experimental notice: This project is still *experimental* and only serves as a proof of concept for running deep learning frameworks (such like [MindSpore](https://github.com/mindspore-ai/mindspore)) on [WebAssembly runtime](https://github.com/bytecodealliance/wasmtime) with [TVM stack](https://tvm.apache.org/).
+
+- [WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime](#webassembly-graphcompiler-for-deep-learning-framework-with-tvm-runtime)
+    - [Motivation](#motivation)
+    - [Framework Landscape](#framework-landscape)
+    - [Project Status](#project-status)
+    - [PoC Guidelines](#poc-guidelines)
+        - [Pre-installation](#pre-installation)
+        - [Build ResNet50 model](#build-resnet50-model)
+        - [Build wasm-graphcompiler-tvm package](#build-wasm-graphcompiler-tvm-package)
+        - [Test](#test)
+    - [Future Work](#future-work)
+        - [More networks support](#more-networks-support)
+        - [Performance benchmark](#performance-benchmark)
+        - [Native TVM Rust runtime support](#native-tvm-rust-runtime-support)
+    - [Appendix](#appendix)
+        - [System packages install](#system-packages-install)
+    - [Contribution](#contribution)
+
+## Motivation
+
+<img src="https://github.com/dmlc/web-data/raw/master/tvm/tutorial/tvm_support_list.png" alt="TVM hardware support" width="600"/>
+
+As demonstrated in TVM runtime [tutorials](https://tvm.apache.org/docs/tutorials/relay_quick_start.html), TVM already supports WASM as the optional hardware backend, so we can leverage the features of WebAssembly (portability, security) and TVM runtime (domain-specific, optimization) to build a flexible and auto-optimized graph compiler for all deep learning frameworks.
+
+## Framework Landscape
+
+The figures below demonstrate the whole landscape of running deep learning frameworks on WASM runtime with TVM compiler stack.
+
+* WASM graph compiler stack
+    ```
+       _ _ _ _ _ _ _ _ _ _        _ _ _ _ _ _ _        _ _ _ _ _ _ _ _ _ _ _ _
+      |                   |      |             |      |                       |
+      |  Framework Model  | ---> |  ONNX Model | ---> |  TVM Relay Python API |
+      |_ _ _ _ _ _ _ _ _ _|      |_ _ _ _ _ _ _|      |_ _ _ _ _ _ _ _ _ _ _ _|
+                                                                 ||
+                                                                 \/
+                 _ _ _ _ _ _ _ _ _ _ _                  _ _ _ _ _ _ _ _ _ _ _
+                |                     |                |                     |
+                | WASM Graph Compiler |                |  TVM Compiler Stack |
+                |    (TVM runtime)    |                |_ _ _ _ _ _ _ _ _ _ _|
+                |_ _ _ _ _ _ _ _ _ _ _|                          ||
+                          ||                                     \/
+        _ _ _ _ _ _ _ _   ||   _ _ _ _ _ _ _ _ _ _            _ _ _ _ _
+       |               |  \/  |                   |  llvm-ar |         |
+       | *.graph.wasm  | <--- | libgraph_wasm32.a | <------- | graph.o |
+       |_ _ _ _ _ _ _ _|      |_ _ _ _ _ _ _ _ _ _|          |_ _ _ _ _|
+    ```
+
+* WASM graph runtime
+    ```
+         _ _ _ _ _ _ _ _ _ _ _
+        |                     |
+        | WASM Graph Runtime  |
+        |   (WASM runtime)    |
+        |_ _ _ _ _ _ _ _ _ _ _|
+                  ||
+           _ _ _ _\/_ _ _ _
+          |                |
+          |  *.graph.wasm  |
+          |_ _ _ _ _ _ _ _ |
+    ```
+
+## Project Status
+
+This project should be considered **experimental** at the very early stage, all rich features are under active development. Here is the current operator support matrix:
+
+| Model Name | Status |
+| ---------- | ------ |
+| ResNet50 | ✔️ |
+| LeNet | <center>&mdash;</center> |
+
+**NOTICE**: Currently this project is ONLY tested on Ubuntu system, so `Ubuntu 16.04+` should be prepared as the testing environment.
+
+## PoC Guidelines
+
+### Pre-installation
+
+* Rust
+
+    Before running this demo, please make sure [Rust](#system-packages-install) has been installed.
+
+    After Rust installed, execute the code below to add `wasm32-wasi` target:
+    ```shell
+    rustup target add wasm32-wasi
+    ```
+
+* TVM
+
+    Please follow TVM [installations](https://tvm.apache.org/docs/install/index.html), `export TVM_HOME=/path/to/tvm` and add `libtvm_runtime` to your `LD_LIBRARY_PATH`.

Review comment:
       Skip the TVM part, as there are already official instructions for it.

##########
File path: apps/wasm-graphcompiler-tvm/README.md
##########
@@ -0,0 +1,191 @@
+# WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime
+

Review comment:
       Shall we rename it to wasm-standalone? since what we are doing is compiling to a standalone wasm application

##########
File path: apps/wasm-graphcompiler-tvm/wasm-graphcompiler/Cargo.toml
##########
@@ -0,0 +1,26 @@
+[package]
+name = "wasm-graphcompiler-tvm"

Review comment:
       wasm-standalone-example

##########
File path: apps/wasm-graphcompiler-tvm/README.md
##########
@@ -0,0 +1,191 @@
+# WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime
+
+#### Experimental notice: This project is still *experimental* and only serves as a proof of concept for running deep learning frameworks (such as [MindSpore](https://github.com/mindspore-ai/mindspore)) on [WebAssembly runtime](https://github.com/bytecodealliance/wasmtime) with [TVM stack](https://tvm.apache.org/).

Review comment:
       I am not sure mindspore is relevant. If there are examples to interface with MS later, we could add a link to that example later in the README




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] jroesch commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
jroesch commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r461224729



##########
File path: apps/wasm-standalone/wasm-graph/src/types.rs
##########
@@ -0,0 +1,182 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use std::{
+    any::TypeId,
+    os::raw::{c_int, c_void},
+    slice,
+};
+pub use tvm_sys::ffi::DLTensor;
+use tvm_sys::ffi::{
+    DLContext, DLDataType, DLDataTypeCode_kDLFloat, DLDataTypeCode_kDLInt, DLDeviceType_kDLCPU,
+};
+
+#[derive(Debug, PartialEq, Clone, Serialize, Deserialize)]
+pub enum DataType {

Review comment:
       I'm happy to have this done in a follow-up PR, but is it possible to use the `tvm-sys` types instead of redefining these again? My goal is for those types to carry all of the standard behavior, so that we don't need to define duplicates outside of the crate.
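
       For illustration, a minimal sketch of what reusing the `tvm-sys` FFI types could look like, built only on the `DLDataType` and `DLDataTypeCode_*` items this file already imports; the helper names are hypothetical, and the field layout follows the DLPack `DLDataType` struct:

    ```rust
    use tvm_sys::ffi::{DLDataType, DLDataTypeCode_kDLFloat, DLDataTypeCode_kDLInt};

    // Small constructors instead of a hand-rolled `DataType` enum; DLDataType is
    // the DLPack type descriptor { code, bits, lanes }.
    fn f32_dtype() -> DLDataType {
        DLDataType { code: DLDataTypeCode_kDLFloat as u8, bits: 32, lanes: 1 }
    }

    fn i32_dtype() -> DLDataType {
        DLDataType { code: DLDataTypeCode_kDLInt as u8, bits: 32, lanes: 1 }
    }
    ```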




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-659758783


   cc @tqchen 


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r457968680



##########
File path: apps/wasm-standalone/wasm-graphruntime/src/main.rs
##########
@@ -0,0 +1,154 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+#[macro_use]
+extern crate serde_derive;
+
+pub mod types;
+use types::Tensor;
+mod runtime;
+
+use getopts::Options;
+use image::{FilterType, GenericImageView};
+use ndarray::Array;
+use std::{collections::HashMap, env, fs::File, io::BufReader};
+
+const IMG_HEIGHT: usize = 224;
+const IMG_WIDTH: usize = 224;
+
+fn print_usage(program: &str, opts: Options) {
+    let brief = format!("Usage: {} [options]", program);
+    print!("{}", opts.usage(&brief));
+}
+
+fn main() {
+    let args: Vec<String> = env::args().collect();
+    let program = args[0].clone();
+
+    let mut opts = Options::new();
+    opts.optopt(
+        "g",
+        "wasm-graph-file",
+        "set the path to wasm graph file",
+        "FILE_PATH",
+    );
+    opts.optopt(
+        "i",
+        "input-data-file",
+        "set the path to input image file",
+        "FILE_PATH",
+    );
+    opts.optopt(
+        "l",
+        "label-class-file",
+        "set the path to label class file",
+        "FILE_PATH",
+    );
+    opts.optflag("h", "help", "print this help menu");
+    let matches = match opts.parse(&args[1..]) {
+        Ok(m) => m,
+        Err(f) => panic!(f.to_string()),
+    };
+    if matches.opt_present("h") {
+        print_usage(&program, opts);
+        return;
+    }
+    let wasm_graph_file: String = match matches.opt_str("g") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let input_data_file: String = match matches.opt_str("i") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let label_class_file: String = match matches.opt_str("l") {
+        Some(s) => s,
+        None => String::from(""),
+    };
+    let img = image::open(input_data_file).unwrap();
+    let input = data_preprocess(img);
+
+    let output: Tensor = match runtime::execute(wasm_graph_file, input) {
+        Ok(m) => m,
+        Err(f) => panic!(f.to_string()),
+    };
+    output_assert(output, label_class_file);
+}
+
+fn data_preprocess(img: image::DynamicImage) -> Tensor {
+    println!("original image dimensions: {:?}", img.dimensions());
+    let img = img
+        .resize_exact(IMG_HEIGHT as u32, IMG_WIDTH as u32, FilterType::Nearest)
+        .to_rgb();
+    println!("resized image dimensions: {:?}", img.dimensions());
+    let mut pixels: Vec<f32> = vec![];
+    for pixel in img.pixels() {
+        let tmp = pixel.data;
+        // normalize the RGB channels using mean, std of imagenet1k
+        let tmp = [
+            (tmp[0] as f32 - 123.0) / 58.395, // R
+            (tmp[1] as f32 - 117.0) / 57.12,  // G
+            (tmp[2] as f32 - 104.0) / 57.375, // B
+        ];
+        for e in &tmp {
+            pixels.push(*e);
+        }
+    }
+
+    // (H,W,C) -> (C,H,W)
+    let arr = Array::from_shape_vec((IMG_HEIGHT, IMG_WIDTH, 3), pixels).unwrap();
+    let arr = arr.permuted_axes([2, 0, 1]);
+    let arr = Array::from_iter(arr.into_iter().map(|&v| v));
+
+    return Tensor::from(arr);
+}
+
+fn output_assert(out_tensor: Tensor, label_class_file: String) {
+    let output = out_tensor.to_vec::<f32>();
+
+    // Find the maximum entry in the output and its index.
+    let mut argmax = -1;
+    let mut max_prob = 0.;
+    for i in 0..output.len() {
+        if output[i] > max_prob {
+            max_prob = output[i];
+            argmax = i as i32;
+        }
+    }
+
+    // Create a hash map of (class id, class name)
+    let mut synset: HashMap<i32, String> = HashMap::new();

Review comment:
       Got it, would update it in next iteration phase : )




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui edited a comment on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui edited a comment on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-659108095


   cc @kazum I have resolved your comments, PTAL


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r459808855



##########
File path: apps/wasm-standalone/wasm-graph/src/utils.rs
##########
@@ -0,0 +1,48 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use super::types::*;
+use serde_json;
+use std::ptr;
+
+pub fn load_input(in_addr: i32, in_size: usize) -> Tensor {

Review comment:
       Got it, thanks for reminding : )




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] tqchen commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
tqchen commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-663691736


   @leonwanghui please send another commit to trigger the CI, @jroesch @nhynes please followup and take another look


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-657903409


   cc @tqchen Please review it again, thanks!


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r453387829



##########
File path: apps/wasm-graphcompiler-tvm/README.md
##########
@@ -0,0 +1,191 @@
+# WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime
+
+#### Experimental notice: This project is still *experimental* and only serves as a proof of concept for running deep learning frameworks (such as [MindSpore](https://github.com/mindspore-ai/mindspore)) on [WebAssembly runtime](https://github.com/bytecodealliance/wasmtime) with [TVM stack](https://tvm.apache.org/).
+
+- [WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime](#webassembly-graphcompiler-for-deep-learning-framework-with-tvm-runtime)
+    - [Motivation](#motivation)
+    - [Framework Landscape](#framework-landscape)
+    - [Project Status](#project-status)
+    - [PoC Guidelines](#poc-guidelines)
+        - [Pre-installation](#pre-installation)
+        - [Build ResNet50 model](#build-resnet50-model)
+        - [Build wasm-graphcompiler-tvm package](#build-wasm-graphcompiler-tvm-package)
+        - [Test](#test)
+    - [Future Work](#future-work)
+        - [More networks support](#more-networks-support)
+        - [Performance benchmark](#performance-benchmark)
+        - [Native TVM Rust runtime support](#native-tvm-rust-runtime-support)
+    - [Appendix](#appendix)
+        - [System packages install](#system-packages-install)
+    - [Contribution](#contribution)
+
+## Motivation
+
+<img src="https://github.com/dmlc/web-data/raw/master/tvm/tutorial/tvm_support_list.png" alt="TVM hardware support" width="600"/>
+
+As demonstrated in TVM runtime [tutorials](https://tvm.apache.org/docs/tutorials/relay_quick_start.html), TVM already supports WASM as the optional hardware backend, so we can leverage the features of WebAssembly (portability, security) and TVM runtime (domain-specific, optimization) to build a flexible and auto-optimized graph compiler for all deep learning frameworks.
+
+## Framework Landscape
+
+The figures below demonstrate the whole landscape of running deep learning frameworks on WASM runtime with TVM compiler stack.
+
+* WASM graph compiler stack
+    ```
+       _ _ _ _ _ _ _ _ _ _        _ _ _ _ _ _ _        _ _ _ _ _ _ _ _ _ _ _ _
+      |                   |      |             |      |                       |
+      |  Framework Model  | ---> |  ONNX Model | ---> |  TVM Relay Python API |
+      |_ _ _ _ _ _ _ _ _ _|      |_ _ _ _ _ _ _|      |_ _ _ _ _ _ _ _ _ _ _ _|
+                                                                 ||
+                                                                 \/
+                 _ _ _ _ _ _ _ _ _ _ _                  _ _ _ _ _ _ _ _ _ _ _
+                |                     |                |                     |
+                | WASM Graph Compiler |                |  TVM Compiler Stack |
+                |    (TVM runtime)    |                |_ _ _ _ _ _ _ _ _ _ _|
+                |_ _ _ _ _ _ _ _ _ _ _|                          ||
+                          ||                                     \/
+        _ _ _ _ _ _ _ _   ||   _ _ _ _ _ _ _ _ _ _            _ _ _ _ _
+       |               |  \/  |                   |  llvm-ar |         |
+       | *.graph.wasm  | <--- | libgraph_wasm32.a | <------- | graph.o |
+       |_ _ _ _ _ _ _ _|      |_ _ _ _ _ _ _ _ _ _|          |_ _ _ _ _|
+    ```
+
+* WASM graph runtime
+    ```
+         _ _ _ _ _ _ _ _ _ _ _
+        |                     |
+        | WASM Graph Runtime  |
+        |   (WASM runtime)    |
+        |_ _ _ _ _ _ _ _ _ _ _|
+                  ||
+           _ _ _ _\/_ _ _ _
+          |                |
+          |  *.graph.wasm  |
+          |_ _ _ _ _ _ _ _ |
+    ```
+
+## Project Status
+
+This project should be considered **experimental**; it is at a very early stage, and its features are still under active development. Here is the current model support matrix:
+
+| Model Name | Status |
+| ---------- | ------ |
+| ResNet50 | ✔️ |
+| LeNet | <center>&mdash;</center> |
+
+**NOTICE**: Currently this project is ONLY tested on Ubuntu, so `Ubuntu 16.04+` should be used as the testing environment.
+
+## PoC Guidelines
+
+### Pre-installation
+
+* Rust
+
+    Before running this demo, please make sure [Rust](#system-packages-install) has been installed.
+
+    After Rust is installed, run the command below to add the `wasm32-wasi` target:
+    ```shell
+    rustup target add wasm32-wasi
+    ```
+
+* TVM
+
+    Please follow the TVM [installation guide](https://tvm.apache.org/docs/install/index.html), then `export TVM_HOME=/path/to/tvm` and add `libtvm_runtime` to your `LD_LIBRARY_PATH`.

Review comment:
       Got it




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] tqchen commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
tqchen commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-657946635


   @leonwanghui it seems that we might want to include .cargo/config, can you modify `https://github.com/apache/incubator-tvm/blob/master/tests/lint/check_file_type.py#L110`  to enable that?


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r457875722



##########
File path: apps/wasm-standalone/wasm-graphruntime/src/runtime.rs
##########
@@ -0,0 +1,74 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use super::types::Tensor;
+use anyhow::Result;
+use serde_json;
+use wasmtime::*;
+use wasmtime_wasi::{Wasi, WasiCtx};
+
+pub fn execute(wasm_backend_file: String, input_data: Tensor) -> Result<Tensor> {
+    let engine = Engine::new(Config::new().wasm_simd(true));
+    let store = Store::new(&engine);
+
+    // First set up our linker which is going to be linking modules together. We
+    // want our linker to have wasi available, so we set that up here as well.
+    let mut linker = Linker::new(&store);
+    // Create an instance of `Wasi` which contains a `WasiCtx`. Note that
+    // `WasiCtx` provides a number of ways to configure what the target program
+    // will have access to.
+    let wasi = Wasi::new(&store, WasiCtx::new(std::env::args())?);
+    wasi.add_to_linker(&mut linker)?;
+
+    let module = Module::from_file(&store, &wasm_backend_file)?;
+    let instance = linker.instantiate(&module)?;
+    let memory = instance
+        .get_memory("memory")
+        .ok_or(anyhow::format_err!("failed to find `memory` export"))?;

Review comment:
       @nhynes It's the pattern suggested by the official wasmtime guidelines (https://bytecodealliance.github.io/wasmtime/examples-rust-embed.html).
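
       For context, a rough sketch of how such a host could continue from the snippet above. It assumes a 2020-era wasmtime API where `Func::get2` and the unsafe `Memory::data_unchecked_mut` accessor are available, plus a hypothetical guest export named `run(in_addr, in_size) -> out_size`; it is not the exact code in this PR:

    ```rust
    // Hypothetical continuation of the `execute` function above; names and
    // the memory offset are illustrative only.
    let run = instance
        .get_func("run")
        .ok_or(anyhow::format_err!("failed to find `run` export"))?
        .get2::<i32, i32, i32>()?;

    // Serialize the input tensor and copy it into the guest's linear memory.
    let in_data = serde_json::to_vec(&input_data)?;
    let in_addr = 0x1000usize; // assumed offset reserved by the guest side
    unsafe {
        memory.data_unchecked_mut()[in_addr..in_addr + in_data.len()]
            .copy_from_slice(&in_data);
    }

    // Call the guest; it returns the byte size of the serialized output tensor.
    let out_size = run(in_addr as i32, in_data.len() as i32)?;
    ```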




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] kazum commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
kazum commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r455311346



##########
File path: apps/wasm-standalone/README.md
##########
@@ -0,0 +1,207 @@
+<!--- Licensed to the Apache Software Foundation (ASF) under one -->
+<!--- or more contributor license agreements.  See the NOTICE file -->
+<!--- distributed with this work for additional information -->
+<!--- regarding copyright ownership.  The ASF licenses this file -->
+<!--- to you under the Apache License, Version 2.0 (the -->
+<!--- "License"); you may not use this file except in compliance -->
+<!--- with the License.  You may obtain a copy of the License at -->
+
+<!---   http://www.apache.org/licenses/LICENSE-2.0 -->
+
+<!--- Unless required by applicable law or agreed to in writing, -->
+<!--- software distributed under the License is distributed on an -->
+<!--- "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -->
+<!--- KIND, either express or implied.  See the License for the -->
+<!--- specific language governing permissions and limitations -->
+<!--- under the License. -->
+
+# WebAssembly Standalone for Deep Learning Framework with TVM Runtime
+
+#### Experimental notice: This project is still *experimental* and only serves as a proof of concept for running deep learning frameworks on [WebAssembly runtime](https://github.com/bytecodealliance/wasmtime) with [TVM stack](https://tvm.apache.org/).
+
+- [WebAssembly Standalone for Deep Learning Framework with TVM Runtime](#webassembly-standalone-for-deep-learning-framework-with-tvm-runtime)
+    - [Motivation](#motivation)
+    - [Framework Landscape](#framework-landscape)
+    - [Project Status](#project-status)
+    - [PoC Guidelines](#poc-guidelines)
+        - [Pre-installation](#pre-installation)
+        - [Build ResNet50 model](#build-resnet50-model)
+        - [Build wasm-graph package](#build-wasm-graph-package)
+        - [Test](#test)
+    - [Future Work](#future-work)
+        - [More networks support](#more-networks-support)
+        - [Performance benchmark](#performance-benchmark)
+        - [Native TVM Rust runtime support](#native-tvm-rust-runtime-support)
+    - [Appendix](#appendix)
+        - [System packages install](#system-packages-install)
+    - [Contribution](#contribution)
+
+## Motivation
+
+<img src="https://github.com/dmlc/web-data/raw/master/tvm/tutorial/tvm_support_list.png" alt="TVM hardware support" width="600"/>
+
+As demonstrated in TVM runtime [tutorials](https://tvm.apache.org/docs/tutorials/relay_quick_start.html), TVM already supports WASM as the optional hardware backend, so we can leverage the features of WebAssembly (portability, security) and TVM runtime (domain-specific, optimization) to build a flexible and auto-optimized graph compiler for all deep learning frameworks.
+
+## Framework Landscape
+
+The figures below demonstrate the whole landscape of running deep learning frameworks on WASM runtime with TVM compiler stack.
+
+* WASM graph generation
+    ```
+       _ _ _ _ _ _ _ _ _ _        _ _ _ _ _ _ _        _ _ _ _ _ _ _ _ _ _ _ _
+      |                   |      |             |      |                       |
+      |  Framework Model  | ---> |  ONNX Model | ---> |  TVM Relay Python API |
+      |_ _ _ _ _ _ _ _ _ _|      |_ _ _ _ _ _ _|      |_ _ _ _ _ _ _ _ _ _ _ _|
+                                                                 ||
+                                                                 \/
+                 _ _ _ _ _ _ _ _ _ _ _                  _ _ _ _ _ _ _ _ _ _ _
+                |                     |                |                     |
+                | WASM Graph AppCode  |                |  TVM Compiler Stack |
+                |    (TVM runtime)    |                |_ _ _ _ _ _ _ _ _ _ _|
+                |_ _ _ _ _ _ _ _ _ _ _|                          ||
+                          ||                                     \/
+      _ _ _ _ _ _ _ _ _   ||   _ _ _ _ _ _ _ _ _ _            _ _ _ _ _
+     |                 |  \/  |                   |  llvm-ar |         |
+     | wasm_graph.wasm | <--- | libgraph_wasm32.a | <------- | graph.o |
+     |_ _ _ _ _ _ _ _ _|      |_ _ _ _ _ _ _ _ _ _|          |_ _ _ _ _|
+    ```
+
+* WASM graph loading
+    ```
+         _ _ _ _ _ _ _ _ _ _ _
+        |                     |
+        | WASM Graph AppCode  |
+        |   (WASM runtime)    |
+        |_ _ _ _ _ _ _ _ _ _ _|
+                  ||
+                  \/
+          _ _ _ _ _ _ _ _ _ _
+         |                   |
+         |  wasm_graph.wasm  |
+         |_ _ _ _ _ _ _ _ _ _|
+    ```
+
+## Project Status
+
+This project should be considered **experimental**; it is at a very early stage, and its features are still under active development. Here is the current model support matrix:
+
+| Model Name | Status |
+| ---------- | ------ |
+| ResNet50 | ✔️ |
+| LeNet | <center>&mdash;</center> |
+
+**NOTICE**: Currently this project is ONLY tested on Ubuntu, so `Ubuntu 16.04+` should be used as the testing environment.
+
+## PoC Guidelines
+
+### Pre-installation
+
+* Rust
+
+    Before running this demo, please make sure [Rust](#system-packages-install) has been installed.
+
+    After Rust is installed, run the command below to add the `wasm32-wasi` target:
+    ```shell
+    rustup target add wasm32-wasi
+    ```
+
+* TVM
+
+    Please follow the TVM [installation guide](https://tvm.apache.org/docs/install/index.html) for detailed instructions.
+
+* LLVM
+
+    `LLVM 10.0` or later is REQUIRED.
+
+### Build ResNet50 model
+
+- Build DL library in the WebAssembly format.
+
+  - Download model
+
+    ```
+    cd wasm-graphcompiler/tools && wget https://s3.amazonaws.com/onnx-model-zoo/resnet/resnet50v1/resnet50v1.onnx

Review comment:
       wasm-graphcompiler -> wasm-graph

##########
File path: apps/wasm-standalone/wasm-graphruntime/src/main.rs
##########
@@ -0,0 +1,154 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+#[macro_use]
+extern crate serde_derive;
+
+pub mod types;
+use types::Tensor;
+mod runtime;
+
+use getopts::Options;
+use image::{FilterType, GenericImageView};
+use ndarray::Array;
+use std::{collections::HashMap, env, fs::File, io::BufReader};
+
+const IMG_HEIGHT: usize = 224;
+const IMG_WIDTH: usize = 224;
+
+fn print_usage(program: &str, opts: Options) {
+    let brief = format!("Usage: {} [options]", program);
+    print!("{}", opts.usage(&brief));
+}
+
+fn main() {
+    let args: Vec<String> = env::args().collect();
+    let program = args[0].clone();
+
+    let mut opts = Options::new();
+    opts.optopt(
+        "c",
+        "ms-backend-config",

Review comment:
       Should be simply "config", or "wasm-file"?

##########
File path: apps/wasm-standalone/wasm-graph/Cargo.toml
##########
@@ -0,0 +1,43 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+[package]
+name = "wasm-graph"
+version = "0.1.0"
+authors = ["TVM Contributors"]
+edition = "2018"
+description = "WebAssembly graph to deep learning frameworks using TVM"
+readme = "README.md"
+repository = "https://github.com/apache/incubator-tvm"
+license = "Apache-2.0"
+keywords = ["wasm", "machine learning", "tvm"]
+
+[profile.release]
+lto = true
+opt-level = 's'
+
+[lib]
+crate-type = ['cdylib']
+
+[dependencies]
+serde = "1.0.53"
+serde_derive = "1.0.53"
+serde_json = "1.0.53"
+ndarray = "0.12"
+tvm-common = { version = "0.1", path = "../tvm/rust/common" }
+tvm-runtime = { version = "0.1", path = "../tvm/rust/runtime" }

Review comment:
       ../../../rust/runtime




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] tqchen edited a comment on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
tqchen edited a comment on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-647898735


   Thanks @leonwanghui! Given that TVM is a compilation stack, directly generating generic ops may not give the best performance for deployment, since the compilation approach relies heavily on shape specialization and fusion.
   
   A better approach for integration might be to create a Relay- or TIR-level integration from the framework into TVM's unified IR, and then let the compilation produce a standalone module.
   
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r452590922



##########
File path: apps/wasm-graphcompiler-tvm/README.md
##########
@@ -0,0 +1,188 @@
+# WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime
+
+#### Experimental notice: This project is still *experimental* and only serves as a proof of concept for running deep learning frameworks (such as [MindSpore](https://github.com/mindspore-ai/mindspore)) on [WebAssembly runtime](https://github.com/bytecodealliance/wasmtime) with [TVM stack](https://tvm.apache.org/).
+
+- [WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime](#webassembly-graphcompiler-for-deep-learning-framework-with-tvm-runtime)
+    - [Motivation](#motivation)
+    - [Framework Landscape](#framework-landscape)
+    - [Project Status](#project-status)
+    - [PoC Guidelines](#poc-guidelines)
+        - [Pre-installation](#pre-installation)
+        - [Build ResNet50 model](#build-resnet50-model)
+        - [Build wasm-graphcompiler-tvm package](#build-wasm-graphcompiler-tvm-package)
+        - [Test](#test)
+    - [Future Work](#future-work)
+        - [More networks support](#more-networks-support)
+        - [Performance benchmark](#performance-benchmark)
+        - [Native TVM Rust runtime support](#native-tvm-rust-runtime-support)
+    - [Appendix](#appendix)
+        - [System packages install](#system-packages-install)
+    - [Contribution](#contribution)
+
+## Motivation
+
+<img src="https://github.com/dmlc/web-data/raw/master/tvm/tutorial/tvm_support_list.png" alt="TVM hardware support" width="600"/>
+
+As demonstrated in TVM runtime [tutorials](https://tvm.apache.org/docs/tutorials/relay_quick_start.html), TVM already supports WASM as the optional hardware backend, so we can leverage the features of WebAssembly (portability, security) and TVM runtime (domain-specific, optimization) to build a flexible and auto-optimized graph compiler for all deep learning frameworks.
+
+## Framework Landscape
+
+The figures below demonstrate the whole landscape of running deep learning frameworks on WASM runtime with TVM compiler stack.
+
+* WASM graph compiler stack
+    ```
+       _ _ _ _ _ _ _ _ _ _        _ _ _ _ _ _ _        _ _ _ _ _ _ _ _ _ _ _ _
+      |                   |      |             |      |                       |
+      |  Framework Model  | ---> |  ONNX Model | ---> |  TVM Relay Python API |
+      |_ _ _ _ _ _ _ _ _ _|      |_ _ _ _ _ _ _|      |_ _ _ _ _ _ _ _ _ _ _ _|
+                                                                 ||
+                                                                 \/
+                 _ _ _ _ _ _ _ _ _ _ _                  _ _ _ _ _ _ _ _ _ _ _
+                |                     |                |                     |
+                | WASM Graph Compiler |                |  TVM Compiler Stack |
+                |    (TVM runtime)    |                |_ _ _ _ _ _ _ _ _ _ _|

Review comment:
       Please note that the "graph compiler" module is just a wrapper around the TVM Rust runtime; I need to compile the TVM runtime into WASM code so it can be loaded with `wasmtime`. Is that OK with you?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] tqchen commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
tqchen commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-664756299


   Thanks @leonwanghui @jroesch @nhynes @kazum  ! this PR is now merged


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-659108095


   cc @kazum I have resolved your comments, PTAL


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-656422667


   cc @tqchen @jroesch 


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] tqchen commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
tqchen commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r452577924



##########
File path: apps/wasm-graphcompiler-tvm/README.md
##########
@@ -0,0 +1,188 @@
+# WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime
+
+#### Experimental notice: This project is still *experimental* and only serves as a proof of concept for running deep learning frameworks (such as [MindSpore](https://github.com/mindspore-ai/mindspore)) on [WebAssembly runtime](https://github.com/bytecodealliance/wasmtime) with [TVM stack](https://tvm.apache.org/).
+
+- [WebAssembly GraphCompiler for Deep Learning Framework with TVM Runtime](#webassembly-graphcompiler-for-deep-learning-framework-with-tvm-runtime)
+    - [Motivation](#motivation)
+    - [Framework Landscape](#framework-landscape)
+    - [Project Status](#project-status)
+    - [PoC Guidelines](#poc-guidelines)
+        - [Pre-installation](#pre-installation)
+        - [Build ResNet50 model](#build-resnet50-model)
+        - [Build wasm-graphcompiler-tvm package](#build-wasm-graphcompiler-tvm-package)
+        - [Test](#test)
+    - [Future Work](#future-work)
+        - [More networks support](#more-networks-support)
+        - [Performance benchmark](#performance-benchmark)
+        - [Native TVM Rust runtime support](#native-tvm-rust-runtime-support)
+    - [Appendix](#appendix)
+        - [System packages install](#system-packages-install)
+    - [Contribution](#contribution)
+
+## Motivation
+
+<img src="https://github.com/dmlc/web-data/raw/master/tvm/tutorial/tvm_support_list.png" alt="TVM hardware support" width="600"/>
+
+As demonstrated in TVM runtime [tutorials](https://tvm.apache.org/docs/tutorials/relay_quick_start.html), TVM already supports WASM as the optional hardware backend, so we can leverage the features of WebAssembly (portability, security) and TVM runtime (domain-specific, optimization) to build a flexible and auto-optimized graph compiler for all deep learning frameworks.
+
+## Framework Landscape
+
+The figures below demonstrate the whole landscape of running deep learning frameworks on WASM runtime with TVM compiler stack.
+
+* WASM graph compiler stack
+    ```
+       _ _ _ _ _ _ _ _ _ _        _ _ _ _ _ _ _        _ _ _ _ _ _ _ _ _ _ _ _
+      |                   |      |             |      |                       |
+      |  Framework Model  | ---> |  ONNX Model | ---> |  TVM Relay Python API |
+      |_ _ _ _ _ _ _ _ _ _|      |_ _ _ _ _ _ _|      |_ _ _ _ _ _ _ _ _ _ _ _|
+                                                                 ||
+                                                                 \/
+                 _ _ _ _ _ _ _ _ _ _ _                  _ _ _ _ _ _ _ _ _ _ _
+                |                     |                |                     |
+                | WASM Graph Compiler |                |  TVM Compiler Stack |
+                |    (TVM runtime)    |                |_ _ _ _ _ _ _ _ _ _ _|

Review comment:
       I am not too sure what the purpose of the Graph Compiler is here; perhaps we can directly use TVM to generate the graph?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] tqchen commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
tqchen commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-647898735


   Thanks @leonwanghui! Given that TVM is a compilation stack, directly generating generic ops may not give the best performance for deployment, since the compilation approach relies heavily on shape specialization and fusion.
   
   A better approach for integration might be to create a Relay- or TIR-level integration from the framework into TVM's unified IR, and then let the compilation produce a standalone module.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r459809422



##########
File path: apps/wasm-standalone/wasm-graph/src/utils.rs
##########
@@ -0,0 +1,48 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use super::types::*;
+use serde_json;
+use std::ptr;
+
+pub fn load_input(in_addr: i32, in_size: usize) -> Tensor {

Review comment:
       One quick question: if a function is defined with the `unsafe` keyword, do I need to remove the `unsafe` block inside that function?
   ```
   pub unsafe fn store_output(out_addr: i32, output: Tensor) -> usize {
       let out_addr = out_addr as *mut u8;
   
       let data_vec = serde_json::to_vec(&output).unwrap();
       let data_size = data_vec.len();
       for i in 0..data_size {
           unsafe {
               ptr::write(out_addr.offset(i as isize), *data_vec.get(i).unwrap());
           }
       }
   
       data_size
   }
   ```
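
       (For reference, a minimal sketch, assuming the 2018 edition: the body of an `unsafe fn` is already an unsafe context, so the inner `unsafe` block is redundant and only triggers the `unused_unsafe` lint. A simplified variant with the serialization step factored out could look like this; `store_bytes` is an illustrative name, not the PR's API:)

    ```rust
    use std::ptr;

    // The whole body of an `unsafe fn` is an unsafe context (2018 edition),
    // so no nested `unsafe { ... }` block is required.
    pub unsafe fn store_bytes(out_addr: i32, data_vec: &[u8]) -> usize {
        let out_addr = out_addr as *mut u8;
        // Copy the serialized bytes to the caller-provided address in one call.
        ptr::copy_nonoverlapping(data_vec.as_ptr(), out_addr, data_vec.len());
        data_vec.len()
    }
    ```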




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] tqchen commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
tqchen commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-648248759


   @jroesch is already working on some of the RUST FFI access. We can also do that through python or other language binding and only use the rust runtime to execute in wasm.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-654001303


   @tqchen @jroesch @kazum Now this PR is updated, please take a review, thanks!


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] nhynes commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
nhynes commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-660776031


   oh, also, it might be useful to have tensor contents be base64 serialized, if serde support is going to be upstreamed and json is the most common serializer
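
   (A minimal sketch of what that could look like, assuming the `base64` crate and serde's derive feature are added as dependencies; the `Tensor` fields here are illustrative rather than the exact struct in this PR:)

   ```rust
   use serde::{Deserialize, Deserializer, Serialize, Serializer};

   fn as_base64<S: Serializer>(data: &Vec<u8>, ser: S) -> Result<S::Ok, S::Error> {
       ser.serialize_str(&base64::encode(data))
   }

   fn from_base64<'de, D: Deserializer<'de>>(de: D) -> Result<Vec<u8>, D::Error> {
       let s = String::deserialize(de)?;
       base64::decode(&s).map_err(serde::de::Error::custom)
   }

   #[derive(Serialize, Deserialize)]
   struct Tensor {
       dtype: String,
       shape: Vec<i64>,
       // Raw tensor contents travel as a base64 string instead of a JSON number array.
       #[serde(serialize_with = "as_base64", deserialize_with = "from_base64")]
       data: Vec<u8>,
   }
   ```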


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r457818662



##########
File path: apps/wasm-standalone/wasm-graph/src/lib.rs
##########
@@ -0,0 +1,83 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+#[macro_use]
+extern crate lazy_static;
+#[macro_use]
+extern crate serde_derive;
+extern crate ndarray;
+extern crate tvm_runtime;
+
+mod types;
+use types::Tensor;

Review comment:
       Already updated, thx!




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-664156537


   cc @tqchen Your comments have been addressed, PTAL


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] nhynes commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
nhynes commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-664024078


   @tqchen This PR looks fine. I have a few style nits, but I don't think they'd be well received.
   
   Thanks @leonwanghui for the contribution!


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-657915075


   @tqchen It seems that the `./cargo/config` file is not allowed to be included into the repo, the ci error is as follows:
   ```shell
   + docker/bash.sh tvmai/ci-lint:v0.61 ./tests/scripts/task_lint.sh
   
   WORKSPACE: /scratch/jenkins-tvm/cudabuild/workspace/exec_3/tvm/sanity
   
   DOCKER CONTAINER NAME: tvmai/ci-lint:v0.61
   
   
   
   Running './tests/scripts/task_lint.sh' inside tvmai/ci-lint:v0.61...
   
   mesg: ttyname failed: Inappropriate ioctl for device
   
   Adding group `tvm' (GID 1000) ...
   
   Done.
   
   Check file types...
   
   ------File type check report----
   
   apps/wasm-standalone/wasm-graph/.cargo/config
   
   Found 1 files that are now allowed
   
   We do not check in binary files into the repo.
   
   If necessary, please discuss with committers andmodify tests/lint/check_file_type.py to enable the file you need.
   
   script returned exit code 255
   ```


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui edited a comment on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui edited a comment on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-663316322


   cc @tqchen @nhynes @jroesch, I think this PR is ready to go; please note that some of the `clippy` warnings in the `tvm-graph-rt` module will not be addressed in this PR.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-651463246


   Thanks @tqchen, it seems promising to me. I will finish the PoC and update this PR later.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] tqchen commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
tqchen commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-660269409


   Thanks @leonwanghui , also cc @jroesch @nhynes for a quick look


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] tqchen commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
tqchen commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r460627146



##########
File path: apps/wasm-standalone/README.md
##########
@@ -0,0 +1,207 @@
+<!--- Licensed to the Apache Software Foundation (ASF) under one -->
+<!--- or more contributor license agreements.  See the NOTICE file -->
+<!--- distributed with this work for additional information -->
+<!--- regarding copyright ownership.  The ASF licenses this file -->
+<!--- to you under the Apache License, Version 2.0 (the -->
+<!--- "License"); you may not use this file except in compliance -->
+<!--- with the License.  You may obtain a copy of the License at -->
+
+<!---   http://www.apache.org/licenses/LICENSE-2.0 -->
+
+<!--- Unless required by applicable law or agreed to in writing, -->
+<!--- software distributed under the License is distributed on an -->
+<!--- "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -->
+<!--- KIND, either express or implied.  See the License for the -->
+<!--- specific language governing permissions and limitations -->
+<!--- under the License. -->
+
+# WebAssembly Standalone for Deep Learning Framework with TVM Runtime
+
+#### Experimental notice: This project is still *experimental* and only serves as a proof of concept for running deep learning frameworks on [WebAssembly runtime](https://github.com/bytecodealliance/wasmtime) with [TVM stack](https://tvm.apache.org/).
+
+- [WebAssembly Standalone for Deep Learning Framework with TVM Runtime](#webassembly-standalone-for-deep-learning-framework-with-tvm-runtime)
+    - [Motivation](#motivation)
+    - [Framework Landscape](#framework-landscape)
+    - [Project Status](#project-status)
+    - [PoC Guidelines](#poc-guidelines)
+        - [Pre-installation](#pre-installation)
+        - [Build ResNet50 model](#build-resnet50-model)
+        - [Build wasm-graph package](#build-wasm-graph-package)
+        - [Test](#test)
+    - [Future Work](#future-work)
+        - [More networks support](#more-networks-support)
+        - [Performance benchmark](#performance-benchmark)
+        - [Native TVM Rust runtime support](#native-tvm-rust-runtime-support)
+    - [Appendix](#appendix)
+        - [System packages install](#system-packages-install)
+    - [Contribution](#contribution)
+
+## Motivation
+
+<img src="https://github.com/dmlc/web-data/raw/master/tvm/tutorial/tvm_support_list.png" alt="TVM hardware support" width="600"/>
+
+As demonstrated in TVM runtime [tutorials](https://tvm.apache.org/docs/tutorials/relay_quick_start.html), TVM already supports WASM as the optional hardware backend, so we can leverage the features of WebAssembly (portability, security) and TVM runtime (domain-specific, optimization) to build a flexible and auto-optimized graph compiler for all deep learning frameworks.
+
+## Framework Landscape
+
+The figures below demonstrate the overall landscape of running deep learning frameworks on a WASM runtime with the TVM compiler stack.
+
+* WASM graph generation
+    ```
+       _ _ _ _ _ _ _ _ _ _        _ _ _ _ _ _ _        _ _ _ _ _ _ _ _ _ _ _ _
+      |                   |      |             |      |                       |
+      |  Framework Model  | ---> |  ONNX Model | ---> |  TVM Relay Python API |
+      |_ _ _ _ _ _ _ _ _ _|      |_ _ _ _ _ _ _|      |_ _ _ _ _ _ _ _ _ _ _ _|
+                                                                 ||
+                                                                 \/
+                 _ _ _ _ _ _ _ _ _ _ _                  _ _ _ _ _ _ _ _ _ _ _
+                |                     |                |                     |
+                | WASM Graph Builder  |                |  TVM Compiler Stack |
+                |    (TVM runtime)    |                |_ _ _ _ _ _ _ _ _ _ _|
+                |_ _ _ _ _ _ _ _ _ _ _|                          ||
+                          ||                                     \/
+      _ _ _ _ _ _ _ _ _   ||   _ _ _ _ _ _ _ _ _ _            _ _ _ _ _
+     |                 |  \/  |                   |  llvm-ar |         |
+     | wasm_graph.wasm | <--- | libgraph_wasm32.a | <------- | graph.o |
+     |_ _ _ _ _ _ _ _ _|      |_ _ _ _ _ _ _ _ _ _|          |_ _ _ _ _|
+    ```
+
+* WASM graph loading
+    ```
+         _ _ _ _ _ _ _ _ _ _ _
+        |                     |
+        |  WASM Graph Loader  |
+        |   (WASM runtime)    |
+        |_ _ _ _ _ _ _ _ _ _ _|
+                  ||
+                  \/
+          _ _ _ _ _ _ _ _ _ _
+         |                   |
+         |  wasm_graph.wasm  |
+         |_ _ _ _ _ _ _ _ _ _|
+    ```
+
+## Project Status
+
+This project should be considered **experimental** and at a very early stage; all rich features are under active development. Here is the current model support matrix:
+
+| Model Name | Status |
+| ---------- | ------ |
+| ResNet50 | ✔️ |
+| LeNet | <center>&mdash;</center> |
+
+**NOTICE**: Currently this project is ONLY tested on Ubuntu, so `Ubuntu 16.04+` should be used as the testing environment.
+
+## PoC Guidelines
+
+### Pre-installation
+
+* Rust
+
+    Before running this demo, please make sure [Rust](#system-packages-install) has been installed.
+
+    After Rust is installed, execute the command below to add the `wasm32-wasi` target:
+    ```shell
+    rustup target add wasm32-wasi
+    ```
+
+* TVM
+
+    Please follow the TVM [installation guide](https://tvm.apache.org/docs/install/index.html) for detailed instructions.
+
+* LLVM
+
+    `LLVM 10.0` or later is REQUIRED.
+
+### Build ResNet50 model
+
+- Build DL library in the WebAssembly format.
+
+  - Download model
+
+    ```
+    cd wasm-graph/tools && wget https://s3.amazonaws.com/onnx-model-zoo/resnet/resnet50v1/resnet50v1.onnx
+    ```
+
+  - Compile
+
+    ```
+    LLVM_AR=llvm-ar-10 python ./build_graph_lib.py -O3 ./resnet50v1.onnx
+    ```
+
+### Build wasm-graph package
+
+```shell
+cd wasm-graph && cargo build --release
+cp ./target/wasm32-wasi/release/wasm_graph.wasm ./lib/wasm_graph_resnet50.wasm
+```
+
+### Test
+
+Before running this demo, please make sure [`Rust`](#system-packages-install) has been installed.
+
+Next, run the command below to build the runtime package for testing (`Rust` REQUIRED):
+
+```shell
+cd wasm-runtime/tests/test_graph_resnet50 && cargo build
+```
+
+Check the usage of `test_graph_resnet50`:
+
+```shell
+~# ./target/debug/test_graph_resnet50 -h
+
+Usage: ./target/debug/test_graph_resnet50 [options]
+
+Options:
+    -g, --wasm-graph-file FILE_PATH
+                        set the path to wasm graph file
+    -i, --input-data-file FILE_PATH
+                        set the path to input image file
+    -l, --label-class-file FILE_PATH
+                        set the path to label class file
+    -h, --help          print this help menu
+```
+
+Next, perform model inference with the commands below:
+```
+$ cp ../../../wasm-graph/lib/wasm_graph_resnet50.wasm ./
+$ wget -O cat.png https://github.com/dmlc/mxnet.js/blob/master/data/cat.png?raw=true
+$ wget -O synset.csv https://raw.githubusercontent.com/kazum/tvm-wasm/master/synset.csv
+$ ./target/debug/test_graph_resnet50 -g ./wasm_graph_resnet50.wasm -i ./cat.png -l ./synset.csv
+original image dimensions: (256, 256)
+resized image dimensions: (224, 224)
+input image belongs to the class `tabby, tabby cat`
+```
+
+## Future Work
+
+### More networks support
+TODO
+
+### Performance benchmark
+
+We are working on several performance improvements:
+* WebAssembly simd128 support (**Done**)
+* Auto-tvm enhancement for llvm target
+
+### Native TVM Rust runtime support
+TODO
+
+## Appendix
+
+### System packages install
+
+* Rust (latest version)
+
+    If you are running Windows, download and run [rustup-init.exe](https://win.rustup.rs/), then follow the on-screen instructions to install Rust.
+
+    If you are a Linux user, run the following in your terminal, then follow the on-screen instructions to install Rust.
+
+    ```shell
+    curl https://sh.rustup.rs -sSf | sh
+    ```
+
+## Contribution
+

Review comment:
       Remove the ack, since the project will evolve and we generally remove author info from the project (the commit history suffices); we can also add people as co-authors.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] nhynes commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
nhynes commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r458315283



##########
File path: apps/wasm-standalone/wasm-graph/src/utils.rs
##########
@@ -0,0 +1,48 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use super::types::*;
+use serde_json;
+use std::ptr;
+
+pub fn load_input(in_addr: i32, in_size: usize) -> Tensor {

Review comment:
   > I'm afraid the memory errors would be difficult to debug
   
   What? This function is semantically unsafe, therefore it should be `pub unsafe fn load_input`. The point is to make the places where it's called self-documenting.
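
A minimal sketch of the suggested signature, assuming (per the `serde_json` import in the quoted file) that the input bytes are a JSON-encoded tensor; the generic `T` and the body are illustrative, not the PR's actual implementation:

```rust
use serde::de::DeserializeOwned;

/// Reads `in_size` bytes starting at offset `in_addr` of the WASM linear
/// memory and deserializes them with `serde_json`.
///
/// # Safety
/// The caller must guarantee that `in_addr .. in_addr + in_size` is a valid,
/// initialized region of linear memory holding UTF-8 JSON for a `T`.
pub unsafe fn load_input<T: DeserializeOwned>(in_addr: i32, in_size: usize) -> T {
    // Reinterpret the raw offset handed over by the host as a pointer.
    let in_ptr = in_addr as usize as *const u8;
    // Borrowing the region as a byte slice is the semantically unsafe step.
    let bytes = std::slice::from_raw_parts(in_ptr, in_size);
    serde_json::from_slice(bytes).expect("input bytes are not valid JSON")
}
```

A call site then reads `let input: Tensor = unsafe { load_input(in_addr, in_size) };`, which is exactly the self-documenting trust boundary described above.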





----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r457808668



##########
File path: apps/wasm-standalone/wasm-graph/src/utils.rs
##########
@@ -0,0 +1,48 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use super::types::*;
+use serde_json;
+use std::ptr;
+
+pub fn load_input(in_addr: i32, in_size: usize) -> Tensor {

Review comment:
       Considering this module will be built into WASM bytecode, I'm afraid memory errors would be difficult to debug.
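
Marking the function `unsafe` does not change the generated bytecode, so debuggability is unaffected; it only annotates call sites. If surfacing errors is the concern, a fallible variant (an illustrative sketch under the same JSON-input assumption, not part of the PR) can report malformed input to the host instead of trapping inside the module:

```rust
use serde::de::DeserializeOwned;

/// Fallible variant of `load_input`: malformed input is reported as an error
/// string rather than aborting inside the WASM module.
///
/// # Safety
/// `in_addr .. in_addr + in_size` must be a valid, initialized region of the
/// module's linear memory.
pub unsafe fn try_load_input<T: DeserializeOwned>(
    in_addr: i32,
    in_size: usize,
) -> Result<T, String> {
    let bytes = std::slice::from_raw_parts(in_addr as usize as *const u8, in_size);
    serde_json::from_slice(bytes).map_err(|e| format!("failed to decode input: {}", e))
}
```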




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui edited a comment on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui edited a comment on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-663316322


   cc @tqchen @nhynes, I think this PR is ready to go. Please note that some of the `clippy` warnings in the `tvm-graph-rt` module will not be addressed in this PR.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r460629761



##########
File path: apps/wasm-standalone/README.md
##########
@@ -0,0 +1,207 @@

Review comment:
       Got it




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on a change in pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on a change in pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#discussion_r457815499



##########
File path: apps/wasm-standalone/wasm-graph/src/types.rs
##########
@@ -0,0 +1,183 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+use std::{
+    any::TypeId,
+    convert::From,
+    os::raw::{c_int, c_void},
+    slice,
+};
+pub use tvm_common::ffi::DLTensor;
+use tvm_common::ffi::{
+    DLContext, DLDataType, DLDataTypeCode_kDLFloat, DLDataTypeCode_kDLInt, DLDeviceType_kDLCPU,
+};
+
+#[derive(Debug, PartialEq, Clone, Serialize, Deserialize)]
+pub enum DataType {
+    FP32,
+    INT32,
+    INT8,
+}
+
+impl DataType {
+    pub fn as_dldtype(&self) -> DLDataType {
+        match self {
+            DataType::INT32 => DLDataType {
+                code: DLDataTypeCode_kDLInt as u8,
+                bits: 32u8,
+                lanes: 1u16,
+            },
+            DataType::INT8 => DLDataType {
+                code: DLDataTypeCode_kDLInt as u8,
+                bits: 8u8,
+                lanes: 1u16,
+            },
+            DataType::FP32 => DLDataType {
+                code: DLDataTypeCode_kDLFloat as u8,
+                bits: 32u8,
+                lanes: 1u16,
+            },
+        }
+    }
+
+    /// Returns whether this `DataType` represents primitive type `T`.
+    pub fn is_type<T: 'static>(&self) -> bool {
+        let typ = TypeId::of::<T>();
+        typ == TypeId::of::<i32>() || typ == TypeId::of::<i8>() || typ == TypeId::of::<f32>()
+    }
+}
+
+impl From<DLDataType> for DataType {
+    fn from(dl_dtype: DLDataType) -> Self {
+        if dl_dtype.code == DLDataTypeCode_kDLInt as u8 && dl_dtype.bits == 32u8 {
+            DataType::INT32
+        } else if dl_dtype.code == DLDataTypeCode_kDLInt as u8 && dl_dtype.bits == 8u8 {
+            DataType::INT8
+        } else if dl_dtype.code == DLDataTypeCode_kDLFloat as u8 && dl_dtype.bits == 32u8 {
+            DataType::FP32
+        } else {
+            DataType::FP32
+        }
+    }
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct Tensor {

Review comment:
       Sure, I would make this change after the Rust runtime is stabilized.
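
Whichever tensor type the stabilized Rust runtime ends up exposing, the bridge into TVM's C ABI will look roughly like the sketch below (assuming the TVM 0.7-era `tvm_common` bindings). Field names follow the dlpack bindings imported in the quoted `types.rs`; the `CpuTensor` stand-in and its buffer handling are illustrative, not the PR's actual code:

```rust
use std::os::raw::c_void;
use tvm_common::ffi::{
    DLContext, DLDataType, DLDataTypeCode_kDLFloat, DLDeviceType_kDLCPU, DLTensor,
};

/// Minimal FP32 CPU tensor stand-in; the PR's `Tensor` also carries a dtype tag.
pub struct CpuTensor {
    pub shape: Vec<i64>,
    pub data: Vec<f32>,
}

impl CpuTensor {
    /// Borrows the buffers as a `DLTensor` so TVM runtime calls can consume
    /// them without copying. The result aliases `self`, so it must not be
    /// used after this tensor is dropped or mutated.
    pub fn as_dltensor(&mut self) -> DLTensor {
        DLTensor {
            data: self.data.as_mut_ptr() as *mut c_void,
            ctx: DLContext {
                device_type: DLDeviceType_kDLCPU,
                device_id: 0,
            },
            ndim: self.shape.len() as i32,
            dtype: DLDataType {
                code: DLDataTypeCode_kDLFloat as u8,
                bits: 32,
                lanes: 1,
            },
            shape: self.shape.as_mut_ptr(),
            strides: std::ptr::null_mut(),
            byte_offset: 0,
        }
    }
}
```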




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] leonwanghui commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
leonwanghui commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-647905130


   @tqchen Thanks for your suggestion. If we target Relay- or TIR-level integration, does that mean we need to create Rust FFI bindings for them first?


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-tvm] tqchen commented on pull request #5892: Add TVM application extension with WASM runtime

Posted by GitBox <gi...@apache.org>.
tqchen commented on pull request #5892:
URL: https://github.com/apache/incubator-tvm/pull/5892#issuecomment-657946685


   cc @kazum 


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org