Intel Cloud-Hypervisor 0.9 Brings io_uring Block Device Support For Faster Performance

Written by Michael Larabel in Virtualization on 8 August 2020 at 03:27 PM EDT.
Intel's Cloud Hypervisor, a Rust-based hypervisor focused on cloud workloads, is closing in on its 1.0 milestone. With this week's release of Cloud-Hypervisor 0.9 there is one particularly exciting feature along with a number of other interesting changes.

Most interesting with Cloud-Hypervisor 0.9 is io_uring block device support, which is used when the host kernel supports this modern Linux I/O interface. io_uring has shown much potential and seen rapid adoption over the past year for faster asynchronous I/O on Linux, thanks to ring buffers shared between applications and the kernel along with other design decisions that reduce kernel overhead.
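For those unfamiliar with the interface, below is a minimal sketch of a single asynchronous read using the tokio-rs io-uring crate, from the same Rust ecosystem Cloud-Hypervisor is built on. The file path, queue depth, and buffer size are arbitrary placeholders for illustration; this is not Cloud-Hypervisor's own block device code:

use io_uring::{opcode, types, IoUring};
use std::fs::File;
use std::os::unix::io::AsRawFd;

fn main() -> std::io::Result<()> {
    // Create a ring with 8 submission queue entries (arbitrary depth).
    let mut ring = IoUring::new(8)?;
    let file = File::open("/etc/hostname")?; // placeholder path
    let mut buf = vec![0u8; 4096];

    // Build a read request and tag it so its completion can be matched up later.
    let read_e = opcode::Read::new(types::Fd(file.as_raw_fd()), buf.as_mut_ptr(), buf.len() as _)
        .build()
        .user_data(0x42);

    // Push the entry onto the shared submission ring.
    unsafe { ring.submission().push(&read_e).expect("submission queue is full") };

    // Notify the kernel and wait for one completion.
    ring.submit_and_wait(1)?;

    // Reap the completion; result() is the byte count or a negative errno.
    let cqe = ring.completion().next().expect("completion queue is empty");
    assert_eq!(cqe.user_data(), 0x42);
    println!("read {} bytes", cqe.result());
    Ok(())
}

The shared rings are the key point: the application pushes submission entries and reaps completion entries directly in memory, with a syscall only needed to kick or wait on whole batches rather than one per I/O.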

With Cloud-Hypervisor 0.9, the hypervisor can make use of io_uring for its block device implementation, which the project reports yields "a very significant performance improvement." Also on the performance front, the Cloud-Hypervisor release build is now compiled with Link-Time Optimization (LTO) enabled, which can bring performance benefits and also shrinks the binary size by around 20%.
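For reference, enabling LTO for a Rust project's release build comes down to a one-line Cargo profile setting. This shows the generic mechanism rather than quoting Cloud-Hypervisor's actual manifest:

[profile.release]
lto = true        # cross-crate link-time optimization for release builds
codegen-units = 1 # optional companion setting: larger optimization scope, slower compiles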

Cloud-Hypervisor 0.9 also brings block and network device statistics, configurable CPU topology support, better snapshot/restore handling, VirtIO memory ballooning support, improved ARM64 support, Intel SGX support, and a variety of fixes and other improvements.

More details on this Intel-backed Rust-VMM project via GitHub.