Running The RadeonSI NIR Back-End With Mesa 19.1 Git

Written by Michael Larabel in Display Drivers on 10 February 2019 at 12:05 PM EST.

It's been a number of months since we last tried the RadeonSI NIR back-end, which is being developed as part of the OpenGL 4.6 SPIR-V work for this AMD OpenGL driver and may eventually replace TGSI as the driver's default intermediate representation. Given the time since that last look and the increasing popularity of NIR across Mesa, this weekend I ran some fresh tests of the NIR back-end with a Radeon Vega graphics card.

The RadeonSI NIR support isn't enabled by default but can be activated by setting the R600_DEBUG=nir environment variable. The developers have been pursuing this NIR support in order to re-use existing code as part of the long-awaited OpenGL 4.6 SPIR-V ingestion work, which is still ongoing.
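For those wanting to try it, below is a minimal Python sketch of launching an OpenGL program with that environment variable set. The glxgears binary is just a placeholder for whatever game or benchmark you wish to run, and the snippet isn't from the Mesa code-base; exporting R600_DEBUG=nir directly in your shell works just as well.

    # Minimal sketch: run an OpenGL application with the RadeonSI NIR
    # back-end enabled via the R600_DEBUG=nir environment variable.
    import os
    import subprocess

    env = os.environ.copy()
    env["R600_DEBUG"] = "nir"  # ask RadeonSI to compile shaders through NIR instead of TGSI

    # glxgears is only a placeholder; substitute any OpenGL game or benchmark binary.
    subprocess.run(["glxgears"], env=env, check=True)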

The last time I tried out RadeonSI NIR several months ago, it was causing issues with a few OpenGL games, but fortunately that appears to be a thing of the past. When running all of the frequently benchmarked OpenGL Linux games with RadeonSI NIR on Mesa 19.1-devel, I didn't run into any game crashes, rendering corruption, or other nuisances. The experience was great.


This round of testing was done with Mesa 19.1-devel via the Padoka PPA on Ubuntu 18.10 while using a Linux 5.0 Git kernel. A Radeon RX Vega 64 graphics card was used for this quick weekend comparison.

Besides being pleased to run into no visible issues when using the NIR intermediate representation with RadeonSI Gallium3D, I also ran some benchmarks comparing the stock TGSI code-path to the NIR code-path for Linux OpenGL gaming performance. Benchmarks were done using the Phoronix Test Suite.

