WebAssembly Serverless Functions in Netlify
In this article, we will walk through two serverless functions written in Rust and WasmEdge and deployed on Netlify: an image-processing function and a TensorFlow inference function.
For more insight into why to run WasmEdge on Netlify, please refer to the article WebAssembly Serverless Functions in Netlify.
Prerequisites
Since our demo WebAssembly functions are written in Rust, you will need a Rust compiler. Make sure that you install the wasm32-wasi compiler target as follows, in order to generate WebAssembly bytecode.
rustup target add wasm32-wasi
The demo application front end is written in Next.js and deployed on Netlify. We will assume that you already have basic knowledge of how to work with Next.js and Netlify.
Example 1: Image processing
Our first demo application allows users to upload an image and then invoke a serverless function to turn it into black and white. A live demo deployed on Netlify is available.
Fork the demo application’s GitHub repo to get started. To deploy the application on Netlify, just add your GitHub repo to Netlify.
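On the front end, the web page simply posts the raw image bytes to the serverless function and displays the returned image. A hypothetical sketch of that call is shown below; the function name and the /api/hello route are illustrative assumptions, and the actual front-end component in the repo may differ.
// Hypothetical front-end helper (illustrative only; the repo's actual Next.js code may differ).
async function grayscaleImage(file) {
  const resp = await fetch('/api/hello', {
    method: 'POST',
    headers: { 'image-type': file.type }, // the function echoes this back as Content-Type
    body: file,                           // raw image bytes from an <input type="file">
  });
  const blob = await resp.blob();
  return URL.createObjectURL(blob);       // usable as an <img> src
}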
This repo is a standard Next.js application for the Netlify platform. The backend serverless function is in the api/functions/image-grayscale folder. The src/main.rs file contains the Rust program’s source code. The Rust program reads image data from STDIN and then writes the black-and-white image to STDOUT.
use std::io::{self, Read, Write};
use image::{ImageFormat, ImageOutputFormat};

fn main() {
    // Read the raw image bytes from STDIN.
    let mut buf = Vec::new();
    io::stdin().read_to_end(&mut buf).unwrap();

    // Detect the input format and decode the image.
    let image_format_detected: ImageFormat = image::guess_format(&buf).unwrap();
    let img = image::load_from_memory(&buf).unwrap();

    // Convert the image to grayscale.
    let filtered = img.grayscale();

    // Re-encode the result: keep GIF input as GIF, everything else as PNG.
    let mut buf = vec![];
    match image_format_detected {
        ImageFormat::Gif => {
            filtered.write_to(&mut buf, ImageOutputFormat::Gif).unwrap();
        },
        _ => {
            filtered.write_to(&mut buf, ImageOutputFormat::Png).unwrap();
        },
    };

    // Write the encoded image to STDOUT.
    io::stdout().write_all(&buf).unwrap();
    io::stdout().flush().unwrap();
}
You can use Rust’s cargo tool to build the Rust program into WebAssembly bytecode or native code.
cd api/functions/image-grayscale/
cargo build --release --target wasm32-wasi
Copy the build artifacts to the api folder.
cp target/wasm32-wasi/release/grayscale.wasm ../../
The Netlify function runs api/pre.sh upon setting up the serverless environment. It installs the WasmEdge runtime, and then compiles each WebAssembly bytecode program into a native .so library for faster execution.
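For reference, a minimal sketch of what such a setup script might do is shown below. The install URL, version, and paths are illustrative assumptions; see the actual api/pre.sh in the repo for the exact steps.
#!/bin/bash
# Illustrative sketch only -- the real api/pre.sh in the repo may differ.
# Install the WasmEdge runtime into the function's directory (URL and options are assumptions).
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- -p ./
# AOT-compile the WebAssembly bytecode into a native .so library for faster execution.
./bin/wasmedgec grayscale.wasm grayscale.so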
The api/hello.js script loads the WasmEdge runtime, starts the compiled WebAssembly program in WasmEdge, and passes the uploaded image data to it via STDIN. Notice that api/hello.js runs the compiled grayscale.so file generated by api/pre.sh for better performance.
const { spawn } = require('child_process');
const path = require('path');

module.exports = (req, res) => {
  // Start the AOT-compiled program (grayscale.so) in the WasmEdge runtime.
  const wasmedge = spawn(path.join(__dirname, 'wasmedge'), [
    path.join(__dirname, 'grayscale.so'),
  ]);

  // Collect the processed image bytes from the program's STDOUT.
  let d = [];
  wasmedge.stdout.on('data', (data) => {
    d.push(data);
  });

  // When the program exits, return the image with the caller-supplied content type.
  wasmedge.on('close', (code) => {
    let buf = Buffer.concat(d);
    res.setHeader('Content-Type', req.headers['image-type']);
    res.send(buf);
  });

  // Pass the uploaded image to the program via STDIN.
  wasmedge.stdin.write(req.body);
  wasmedge.stdin.end('');
};
That's it. Deploy the repo to Netlify and you now have a Netlify Jamstack app with a high-performance serverless backend written in Rust and WebAssembly.
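To try the function from the command line once it is deployed, you could POST an image to it directly. The site address and the /api/hello route below are placeholders based on this article's handler name; substitute your own deployment's URL.
curl -X POST https://your-site.netlify.app/api/hello \
  -H 'image-type: image/png' \
  --data-binary '@cat.png' \
  -o cat-grayscale.png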
Example 2: AI inference
The second demo application allows users to upload an image and then invoke a serverless function to classify the main subject in the image.
It is in the same GitHub repo as the previous example, but in the tensorflow branch. The backend serverless function for image classification is in the api/functions/image-classification folder of the tensorflow branch. The src/main.rs file contains the Rust program’s source code. The Rust program reads image data from STDIN and then writes the text result to STDOUT. It utilizes the WasmEdge TensorFlow API to run the AI inference.
use std::io::{self, Read};
use wasmedge_tensorflow_interface;

pub fn main() {
    // Step 1: Load the TFLite model and its label file
    let model_data: &[u8] = include_bytes!("models/mobilenet_v1_1.0_224/mobilenet_v1_1.0_224_quant.tflite");
    let labels = include_str!("models/mobilenet_v1_1.0_224/labels_mobilenet_quant_v1_224.txt");

    // Step 2: Read image from STDIN
    let mut buf = Vec::new();
    io::stdin().read_to_end(&mut buf).unwrap();

    // Step 3: Resize the input image for the TensorFlow model
    let flat_img = wasmedge_tensorflow_interface::load_jpg_image_to_rgb8(&buf, 224, 224);

    // Step 4: AI inference
    let mut session = wasmedge_tensorflow_interface::Session::new(
        &model_data,
        wasmedge_tensorflow_interface::ModelType::TensorFlowLite,
    );
    session
        .add_input("input", &flat_img, &[1, 224, 224, 3])
        .run();
    let res_vec: Vec<u8> = session.get_output("MobilenetV1/Predictions/Reshape_1");

    // Step 5: Find the food label that corresponds to the highest probability in res_vec
    // ... ...
    let mut label_lines = labels.lines();
    for _i in 0..max_index {
        label_lines.next();
    }

    // Step 6: Generate the output text
    let class_name = label_lines.next().unwrap().to_string();
    if max_value > 50 {
        println!(
            "It {} a <a href='https://www.google.com/search?q={}'>{}</a> in the picture",
            confidence, class_name, class_name
        );
    } else {
        println!("It does not appear to be any food item in the picture.");
    }
}
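The elided Step 5 is essentially an argmax over the quantized scores in res_vec, plus a rough confidence phrase for the output text. A minimal sketch is below; the thresholds and wording are illustrative assumptions, so check the repo's source for the real implementation.
// Illustrative sketch of Step 5 (thresholds and phrases are assumptions, not the repo's exact code).
let mut max_index: usize = 0;
let mut max_value: u8 = 0;
for (i, &score) in res_vec.iter().enumerate() {
    if score > max_value {
        max_value = score;
        max_index = i;
    }
}
// Map the 0-255 quantized score to a confidence phrase used in the output sentence.
let confidence = if max_value > 200 {
    "is very likely"
} else if max_value > 125 {
    "is likely"
} else {
    "could be"
};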
You can use the cargo tool to build the Rust program into WebAssembly bytecode or native code.
cd api/functions/image-classification/
cargo build --release --target wasm32-wasi
Copy the build artifacts to the api folder.
cp target/wasm32-wasi/release/classify.wasm ../../
Again, the api/pre.sh script installs the WasmEdge runtime and its TensorFlow dependencies for this application. It also compiles the classify.wasm bytecode program into the classify.so native shared library at deployment time.
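For reference, the AOT step for this example would roughly look like the line below. The tool name and paths are assumptions based on the WasmEdge TensorFlow tooling of that period; see api/pre.sh in the tensorflow branch for the authoritative commands.
# Illustrative only -- consult api/pre.sh in the tensorflow branch for the real steps.
# AOT-compile the TensorFlow-enabled WebAssembly program into a native library.
./wasmedgec-tensorflow classify.wasm classify.so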
The api/hello.js script loads the WasmEdge runtime, starts the compiled WebAssembly program in WasmEdge, and passes the uploaded image data to it via STDIN. Notice that api/hello.js runs the compiled classify.so file generated by api/pre.sh for better performance.
const { spawn } = require('child_process');
const path = require('path');

module.exports = (req, res) => {
  // Start the AOT-compiled program (classify.so) in the WasmEdge TensorFlow-Lite runtime.
  const wasmedge = spawn(
    path.join(__dirname, 'wasmedge-tensorflow-lite'),
    [path.join(__dirname, 'classify.so')],
    { env: { LD_LIBRARY_PATH: __dirname } },
  );

  // Collect the classification text from the program's STDOUT.
  let d = [];
  wasmedge.stdout.on('data', (data) => {
    d.push(data);
  });

  // When the program exits, return the text result.
  wasmedge.on('close', (code) => {
    res.setHeader('Content-Type', `text/plain`);
    res.send(d.join(''));
  });

  // Pass the uploaded image to the program via STDIN.
  wasmedge.stdin.write(req.body);
  wasmedge.stdin.end('');
};
You can now deploy your forked repo to Netlify and have a web app for subject classification.
Next, it's your turn to develop Rust serverless functions on Netlify using the netlify-wasm-runtime repo as a template. Looking forward to your great work.