Hi all,
I am trying to build a library on top of WasmEdge's WASI-NN supported backends, but I am struggling to create a VM that can call functions in the compiled wasm lib.
The example in https://github.com/second-state/wasmedge-rustsdk-examples/blob/main/object-detection-via-wasinn/src/main.rs shows that the wasi module has to be initialized with some args before executing ML tasks, but if you simply execute that infer function twice, you will get an error.
So can we create a VM and initialize the wasi module without passing args? And can we reuse the created VM?
To test WasmEdge against the questions above, my experiments are shown below.
I created a lib.rs (compiled to wasm), roughly shown as follows. It is based on a Yomo-wasmedge-tensorflow example and uses wasmedge_tensorflow_interface:
```rust
use std::time::Instant;

use serde::{Deserialize, Serialize};
use wasmedge_bindgen::*;
use wasmedge_bindgen_macro::*;
use wasmedge_tensorflow_interface;

// pytorch_infer is just for testing whether it can be called from the host code.
#[wasmedge_bindgen]
pub fn pytorch_infer(image_data: Vec<u8>) -> Vec<u8> {
    let result = image_data.clone();
    println!("the pytorch_infer is invoked correctly");
    result
}

// tf_infer is based on a Yomo Go example; the lib.rs is designed after it.
#[wasmedge_bindgen]
pub fn tf_infer(image_data: Vec<u8>) -> Result<Vec<u8>, String> {
    let start = Instant::now();
    let model_data: &[u8] = include_bytes!("lite-model_aiy_vision_classifier_food_V1_1.tflite");
    let labels = include_str!("aiy_food_V1_labelmap.txt");
    let flat_img =
        wasmedge_tensorflow_interface::load_jpg_image_to_rgb8(&image_data[..], 192, 192);
    println!("RUST: Loaded image in ... {:?}", start.elapsed());

    let mut session = wasmedge_tensorflow_interface::Session::new(
        &model_data,
        wasmedge_tensorflow_interface::ModelType::TensorFlowLite,
    );
    session
        .add_input("input", &flat_img, &[1, 192, 192, 3])
        .run();
    let res_vec: Vec<u8> = session.get_output("MobilenetV1/Predictions/Softmax");

    // Find the index with the highest score.
    let mut max_index: i32 = -1;
    let mut max_value: u8 = 0;
    for (i, &cur) in res_vec.iter().enumerate() {
        if cur > max_value {
            max_value = cur;
            max_index = i as i32;
        }
    }
    println!("RUST: index {}, prob {}", max_index, max_value);

    let confidence = if max_value > 200 {
        "is very likely"
    } else if max_value > 125 {
        "is likely"
    } else {
        "could be"
    };

    let ret_str = if max_value > 50 {
        let mut label_lines = labels.lines();
        for _ in 0..max_index {
            label_lines.next();
        }
        let food_name = label_lines.next().unwrap().to_string();
        format!(
            "It {} a <a href='https://www.google.com/search?q={}'>{}</a> in the picture",
            confidence, food_name, food_name
        )
    } else {
        "There does not appear to be a food item in the picture.".to_string()
    };
    println!(
        "RUST: Finished post-processing in ... {:?}",
        start.elapsed()
    );
    Ok(ret_str.as_bytes().to_vec())
}
```
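(As a side note, the maximum-score search in tf_infer can also be written with iterators. A minimal standalone sketch, pure Rust with no WasmEdge dependencies; note that on ties `max_by_key` keeps the last maximum, while the strict `>` loop keeps the first:)

```rust
/// Return the index and value of the largest score, or None for an empty slice.
fn argmax(scores: &[u8]) -> Option<(usize, u8)> {
    scores.iter().copied().enumerate().max_by_key(|&(_, v)| v)
}

fn main() {
    let res_vec = vec![3u8, 200, 17, 125];
    // The highest score 200 sits at index 1.
    assert_eq!(argmax(&res_vec), Some((1, 200)));
    // An empty score vector has no maximum.
    assert_eq!(argmax(&[]), None);
    println!("{:?}", argmax(&res_vec));
}
```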
Then I created the main function as follows.
It uses wasmedge_tensorflow_interface; PluginManager is not used.
The vm is wrapped by VmDock::new(vm) so that complex data such as Vec<u8> can be passed. vm.wasi_module_mut().initialize() is not used.
But it eventually ran into an error; I tried different ways to fix it, but I don't know what is missing here.
```rust
use std::{fs::File, io::Read, path::Path};

use image;
#[cfg(all(target_os = "linux", target_arch = "x86_64"))]
use wasmedge_sdk::{
    config::{CommonConfigOptions, ConfigBuilder, HostRegistrationConfigOptions},
    dock::{Param, VmDock},
    plugin::PluginManager,
    Module, VmBuilder,
};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    #[cfg(all(target_os = "linux", target_arch = "x86_64"))]
    infer()?;
    Ok(())
}

#[cfg(all(target_os = "linux", target_arch = "x86_64"))]
fn infer() -> Result<(), Box<dyn std::error::Error>> {
    let wasm_file = &String::from("rust_mobilenet_food_lib.wasm");
    let image_name = &String::from("banana.jpg");

    // image to tensor
    let image_tensor_data = image_to_tensor(image_name.to_string(), 192, 192);

    // // load plugin
    // PluginManager::load(None)?;

    // config for creating a VM
    let config = ConfigBuilder::new(CommonConfigOptions::default())
        .with_host_registration_config(HostRegistrationConfigOptions::default().wasi(true))
        .build()?;
    assert!(config.wasi_enabled());

    // load wasm module from file
    // let module = Module::from_file(Some(&config), wasm_file)?;
    let module = Module::from_file(None, wasm_file)?;

    // create vm without checking wasi-nn
    let vm = VmBuilder::new()
        .with_config(config)
        .build()?
        .register_module(Some("infer_lib"), module)?;

    // create a Vm with the wasi_nn plugin, duplicated with the one above
    // let vm = VmBuilder::new()
    //     .with_config(config)
    //     .with_plugin_wasi_nn()
    //     .build()?
    //     .register_module(Some("infer_lib"), module)?;

    // VmDock returns a new vm that can pass complex data such as Vec<u8>,
    // referring to the wasmedge-bindgen example
    let vm = VmDock::new(vm);
    let params = image_tensor_data;
    // check the data type of params
    let params = vec![Param::VecU8(&params)];
    match vm.run_func("tf_infer", params)? {
        Ok(mut res) => {
            println!(
                "Run bindgen -- tf_infer: {:?}",
                res.pop().unwrap().downcast::<Vec<u8>>().unwrap()
            );
        }
        Err(err) => {
            println!("Run bindgen -- tf_infer FAILED {}", err);
        }
    }
    Ok(())
}

fn image_to_tensor(path: String, height: u32, width: u32) -> Vec<u8> {
    let mut file_img = File::open(path).unwrap();
    let mut img_buf = Vec::new();
    file_img.read_to_end(&mut img_buf).unwrap();
    let img = image::load_from_memory(&img_buf).unwrap().to_rgb8();
    let resized =
        image::imageops::resize(&img, height, width, ::image::imageops::FilterType::Triangle);
    let mut flat_img: Vec<f32> = Vec::new();
    for rgb in resized.pixels() {
        flat_img.push((rgb[0] as f32 / 255. - 0.485) / 0.229);
        flat_img.push((rgb[1] as f32 / 255. - 0.456) / 0.224);
        flat_img.push((rgb[2] as f32 / 255. - 0.406) / 0.225);
    }
    let bytes_required = flat_img.len() * 4;
    let mut u8_f32_arr: Vec<u8> = vec![0; bytes_required];
    for c in 0..3 {
        for i in 0..(flat_img.len() / 3) {
            // Read the number as an f32 and break it into u8 bytes
            let u8_f32: f32 = flat_img[i * 3 + c];
            let u8_bytes = u8_f32.to_ne_bytes();
            for j in 0..4 {
                u8_f32_arr[((flat_img.len() / 3 * c + i) * 4) + j] = u8_bytes[j];
            }
        }
    }
    u8_f32_arr
}
```
and got the error:

```
[2023-09-18 17:31:47.668] [error] instantiation failed: unknown import, Code: 0x62
[2023-09-18 17:31:47.669] [error]     When linking module: "wasmedge_tensorflowlite" , function name: "wasmedge_tensorflowlite_create_session"
[2023-09-18 17:31:47.669] [error]     At AST node: import description
[2023-09-18 17:31:47.669] [error]     At AST node: import section
[2023-09-18 17:31:47.669] [error]     At AST node: module
Error: Core(Instantiation(UnknownImport))
```
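(Separately, to rule out the tensor packing as the culprit, I checked the channel-reordering logic at the end of image_to_tensor in isolation. This is a standalone sketch, pure Rust with no image/WasmEdge dependencies, of what that nested loop does:)

```rust
// Repack an interleaved [r, g, b, r, g, b, ...] f32 buffer into
// channel-planar order [r..., g..., b...] as native-endian bytes,
// mirroring the nested loop in image_to_tensor.
fn hwc_to_chw_bytes(flat_img: &[f32]) -> Vec<u8> {
    let pixels = flat_img.len() / 3;
    let mut out = vec![0u8; flat_img.len() * 4];
    for c in 0..3 {
        for i in 0..pixels {
            let bytes = flat_img[i * 3 + c].to_ne_bytes();
            let off = (pixels * c + i) * 4;
            out[off..off + 4].copy_from_slice(&bytes);
        }
    }
    out
}

fn main() {
    // Two pixels: p0 = (1.0, 2.0, 3.0) and p1 = (4.0, 5.0, 6.0), interleaved.
    let flat = [1.0f32, 2.0, 3.0, 4.0, 5.0, 6.0];
    let packed = hwc_to_chw_bytes(&flat);
    // Planar layout: R-plane [1.0, 4.0], G-plane [2.0, 5.0], B-plane [3.0, 6.0].
    assert_eq!(packed[0..4], 1.0f32.to_ne_bytes()[..]);
    assert_eq!(packed[4..8], 4.0f32.to_ne_bytes()[..]);
    assert_eq!(packed[8..12], 2.0f32.to_ne_bytes()[..]);
    assert_eq!(packed[20..24], 6.0f32.to_ne_bytes()[..]);
    println!("packed {} bytes", packed.len());
}
```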
I am not sure what is missing in the main.rs code. I am also wondering whether we can create a VM to call an infer() function in a compiled lib.rs (using the wasi-nn plugin or the TF/TF-Lite interface), and whether we can reuse the created VM for infer(), just like the wasmedge-bindgen examples call functions in lib.rs. How would that work?
My description is a bit long, but thanks in advance. :)