Linux 3.15 SSD File-System Benchmarks

Written by Michael Larabel in Software on 8 May 2014 at 04:00 AM EDT.

While the AIO-Stress run on the Vertex 3 SSD appeared to be hitting system memory caching, the EXT4 and F2FS file-systems reported improved results with the Linux 3.15 kernel over Linux 3.14. Meanwhile, the Btrfs and XFS readings were slightly lower than on Linux 3.14.

With Dbench there wasn't too much to see beyond F2FS from Samsung being the fastest. However, Samsung's Flash-Friendly File-System reported results faster than would be expected of a Serial ATA 3.0 solid-state drive, so compared to the other file-systems it may still be waiting to commit some data to the disk.
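
To illustrate why a file-system can report write speeds beyond what a SATA 3.0 link can physically deliver, here is a minimal sketch in Python (an illustration for this article, not part of the actual test profiles; the scratch path and 512 MiB size are arbitrary choices). Timing the same sequential write with and without fsync() shows how the Linux page cache can absorb writes and inflate the apparent throughput:

    import os, time

    # Illustrative only: compare buffered write throughput against
    # fsync()-backed throughput. Buffered results can exceed the roughly
    # 600 MB/s ceiling of a SATA 3.0 link because the data may still be
    # sitting in the page cache rather than on the drive.
    PATH = "/tmp/fsync-demo.bin"          # assumed writable scratch location
    SIZE = 512 * 1024 * 1024              # 512 MiB total
    CHUNK = b"\0" * (4 * 1024 * 1024)     # 4 MiB per write() call

    def write_mb_per_sec(sync_to_disk):
        start = time.monotonic()
        with open(PATH, "wb") as f:
            written = 0
            while written < SIZE:
                f.write(CHUNK)
                written += len(CHUNK)
            f.flush()
            if sync_to_disk:
                os.fsync(f.fileno())      # force the data to stable storage
        return (SIZE / (1024 * 1024)) / (time.monotonic() - start)

    print("buffered: %.0f MB/s" % write_mb_per_sec(False))
    print("fsync'd:  %.0f MB/s" % write_mb_per_sec(True))
    os.remove(PATH)

If the buffered figure comes out well above the fsync'd one, the benchmark is at least partly measuring memory rather than the SSD, which is the suspicion with the F2FS Dbench numbers here.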

With the random write Threaded I/O Tester results, F2FS performance appeared degraded with the Linux 3.15 kernel compared to Linux 3.14 stable, while the other file-systems were unaffected.

With the PostMark mail server disk benchmark, the F2FS file-system edged past XFS to take the top spot.

For those curious about the file-system changes made in Linux 3.15, see our many Linux 3.15 articles covering the merge window changes. There are also other benchmarks hosted on OpenBenchmarking.org. Similar benchmarks from a hard drive system are coming shortly.
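
For readers wanting to repeat these tests locally, here is a minimal sketch of driving the four disk benchmarks from Python (assuming the Phoronix Test Suite is installed and batch mode has already been configured via phoronix-test-suite batch-setup; the pts/ profile names are taken from the public OpenBenchmarking.org profiles and may differ by version):

    import subprocess

    # Run the four disk test profiles covered in this article back to back.
    TESTS = ["pts/aio-stress", "pts/dbench", "pts/tiobench", "pts/postmark"]

    for test in TESTS:
        # batch-benchmark runs non-interactively using the saved batch options
        subprocess.run(["phoronix-test-suite", "batch-benchmark", test],
                       check=True)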

If you enjoyed this article consider joining Phoronix Premium to view this site ad-free, read multi-page articles on a single page, and enjoy other benefits. PayPal or Stripe tips are also graciously accepted. Thanks for your support.


About The Author
Michael Larabel

Michael Larabel is the principal author of Phoronix.com and founded the site in 2004 with a focus on enriching the Linux hardware experience. Michael has written more than 20,000 articles covering the state of Linux hardware support, Linux performance, graphics drivers, and other topics. Michael is also the lead developer of the Phoronix Test Suite, Phoromatic, and OpenBenchmarking.org automated benchmarking software. He can be followed via Twitter, LinkedIn, or contacted via MichaelLarabel.com.