How WASI Makes Containerization More Efficient

May 13, 2021 | Announcements

By Marco Fioretti

WebAssembly, or Wasm for short, is a standardized binary format that lets software written in any language run, without customizations, on any platform, inside sandboxes or runtimes (that is, virtual machines) at near-native speed. Since those runtimes are isolated from their host environment, the WebAssembly System Interface (WASI) gives developers, who adopt Wasm precisely so they can write software once without caring where it will run, a single, standard way to call the low-level functions that are present on any platform.

The previous article in this series describes the goals, design principles and architecture of WASI. This time, we present real-world, usable projects and services based on WASI that also clarify its role in the big picture: to facilitate the containerization of virtually any application, much more efficiently than bulkier container technologies like Docker can.

Coding with WASI is only half the job

Programmers can already write and compile code, for example in C or Rust, to create .wasm modules that are usable in any WASI-compliant environment. The question is: do we already have runtimes that can actually execute those modules “outside web browsers”? The answer is yes, and more than one. One general-purpose solution is Wasmtime, from the Bytecode Alliance. This project develops a WASI-compliant runtime for Wasm modules that may be used standalone, as a command-line tool, or embedded into other applications as a library: at the moment, besides plain Bash, Wasmtime is usable from Rust, C, Python, .NET and Go.
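To make this concrete, here is a minimal sketch of that workflow in Rust, assuming a toolchain with the wasm32-wasi target and the wasmtime command-line tool installed; the file and argument names are invented for the example.

```rust
// hello_wasi.rs: a minimal WASI example (a sketch; file names and the
// wasmtime invocation below are illustrative, not prescriptive).
//
// Build and run, assuming rustup and wasmtime are installed:
//   rustup target add wasm32-wasi
//   rustc --target wasm32-wasi hello_wasi.rs -o hello_wasi.wasm
//   wasmtime --dir=. hello_wasi.wasm notes.txt
use std::{env, fs};

fn main() {
    // Ordinary standard-library calls: compiled for wasm32-wasi, they turn
    // into WASI imports (fd_write, args_get, path_open, ...) that the
    // runtime provides and sandboxes.
    println!("Hello from a WASI module!");

    // File access only works inside directories the host explicitly
    // grants, for example via wasmtime's --dir flag above.
    if let Some(path) = env::args().nth(1) {
        match fs::read_to_string(&path) {
            Ok(text) => println!("{} contains {} bytes", path, text.len()),
            Err(err) => eprintln!("cannot read {}: {}", path, err),
        }
    }
}
```

The same hello_wasi.wasm file should then run unchanged under any of the other WASI-compliant runtimes described below.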

Other WASI runtimes are more or less optimized for particular use cases or programming communities. The following examples give an idea of what is possible, without any claim to completeness.

WASI on servers, or REPLACING some servers

Wasmer is an open-source Wasm runtime written in Rust, whose 1.0 version was released in January 2021. Wasmer is specifically designed to run, on generic servers, .wasm modules that use WASI methods to interact with native functions of the host operating system.

Besides offering a standalone runtime that can run Wasm binaries on any platform and chipset, Wasmer, like Wasmtime, is designed to let Wasm modules be used from many other languages, starting with C/C++, Rust, Python, Go, PHP and Ruby.
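As an illustration of what such embedding looks like from Rust, here is a minimal sketch based on the 1.0-era wasmer, wasmer-wasi and anyhow crates; “app.wasm” is a placeholder module name, and method names may differ in later Wasmer releases.

```rust
// A sketch of embedding Wasmer in a Rust host, assuming the 1.0-era
// `wasmer` and `wasmer-wasi` crates; "app.wasm" stands for any module
// compiled against WASI.
use wasmer::{Instance, Module, Store};
use wasmer_wasi::WasiState;

fn main() -> anyhow::Result<()> {
    // Compile the module with Wasmer's default engine and compiler.
    let store = Store::default();
    let module = Module::from_file(&store, "app.wasm")?;

    // Describe the WASI environment the guest will see; arguments,
    // environment variables and preopened directories could be added here.
    let mut wasi_env = WasiState::new("app").finalize()?;

    // Wire the module's WASI imports to Wasmer's implementation,
    // instantiate it, and call its _start entry point.
    let import_object = wasi_env.import_object(&module)?;
    let instance = Instance::new(&module, &import_object)?;
    instance.exports.get_function("_start")?.call(&[])?;
    Ok(())
}
```

The bindings for the other languages follow essentially the same sequence: load the module, prepare a WASI environment, instantiate, and call its entry point.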

To prove its capabilities, the developers of Wasmer compiled an unmodified version of the Nginx web server as a .wasm module, and then actually ran it, using WASI calls to interact with the host system.

Wasmer is also the first Wasm runtime to fully support both WASI and high-performance programming with the Single Instruction, Multiple Data (SIMD) technique: in 2019, the two technologies were used together, with very interesting results, to simulate particle physics. Wasmer developers also participate in work to run Wasm modules inside the Linux kernel, so that tasks which would otherwise need more checks and more context switching, that is, performance hits, can be executed securely via WASI.

Artificial Intelligence, faster than Docker and simpler than Node.js

Second State has developed another virtual machine, called SSVM, to run server-side applications “safer and 10x faster than Docker”. What is particularly interesting about the SSVM runtime is why and how it added and optimized support for WebAssembly and WASI: direct access to hardware, in order to provide artificial intelligence and machine learning “as a service in Node.js, written in Rust, over the Web”. Typical applications, which run up to 25 times faster than equivalent Python code, include recognition of images and other patterns.

The SSVM toolchain can also be used to create Wasm modules for Deno. Deno is a JavaScript and TypeScript runtime written in Rust, created to address the “10 things the creator of Node.js regrets about it”, and it supports WASI for Wasm modules that need to access system resources.

WASI gaming and more, right at the cloud edge

Fastly, an edge cloud platform provider, has developed, and then released as open source, its own WebAssembly compiler and runtime, called Lucet. Fastly created this tool specifically to support faster and safer execution of the code that its customers write, in any language, for the many use cases of the Fastly platform. To show what Wasm and WASI can do in edge computing, a Fastly engineer recently announced that he had ported the Doom first-person shooter to run on Fastly’s edge cloud.

WebAssembly and containers? What’s the difference?

Using WASI and the already mentioned Wasmtime, it is possible both to run Wasm modules from .NET Core applications and to generate modules in the same format with .NET’s Roslyn compiler. Even more interesting are Microsoft’s Krustlets, that is, “Kubernetes Rust kubelets”: a way to orchestrate and run WebAssembly “workloads” alongside standard containers, with Kubernetes. In other words, Wasm and WASI can already enable the orchestration, with standard systems like Kubernetes, of thousands of generic applications, each isolated at least as well as with traditional containers (and side by side with them if needed), but with much smaller overhead.

A WASI-driven Internet of Things

The possibility of executing the same binary format on extremely efficient virtual machines, across many different platforms, means even more than it may seem at first sight, because:

“a WASI-enabled JavaScript runtime and simple firmware may keep a device’s software in sync with a cloud-hosted or locally hosted repository”.

In case you haven’t noticed, procedures like that could make automatic testing and deployment of new firmware or software for IoT, or for any remote device really, much easier and more reliable than they are today. If a remote device can run WebAssembly bytecode, any developer can reliably write and test new software for it simply by using “basic simulators with digital twins” of that device, as discussed here. Isn’t WASI… interesting?
