NVIDIA GeForce GTX 1060 To RTX 4060 GPU Compute & Renderer Performance On Linux

Written by Michael Larabel in Graphics Cards on 10 July 2023 at 02:52 PM EDT. Page 1 of 7.

Earlier this month I provided some initial GeForce RTX 4060 vs. Radeon RX 7600 Linux gaming benchmarks for this new sub-$300 graphics card. For those considering this latest Ada Lovelace graphics card for 3D rendering or compute purposes, here are benchmarks of the GeForce RTX 4060 on that front, looking at the generational performance of NVIDIA's x060-class graphics cards from the GTX 1060 through the RTX 4060.

NVIDIA GTX and RTX graphics cards

Today's benchmarking is looking at the GPU compute performance and Blender rendering performance of the following graphics cards:

- GeForce GTX 1060
- GeForce GTX 1660
- GeForce GTX 1660 Ti
- GeForce RTX 2060
- GeForce RTX 2060 SUPER
- GeForce RTX 3060
- GeForce RTX 3060 Ti
- GeForce RTX 4060

As with the earlier gaming article, the GeForce RTX 4060 used was the MSI Ventus GeForce RTX 4060 8GB, which had to be purchased at retail since NVIDIA did not supply any RTX 4060 series hardware to Phoronix for Linux testing.

NVIDIA GTX 1060 vs. RTX 4060

Only the generational NVIDIA performance is being looked at for this article, as the AMD Radeon RX 7600 series is not yet officially supported by the ROCm compute stack and some of these benchmarks support only NVIDIA hardware.

NVIDIA GeForce RTX 4060 Compute Benchmarks

All of these graphics cards were tested on Ubuntu 23.04 with the Linux 6.4 Git kernel while using the NVIDIA 535.54.03 proprietary Linux graphics driver. In addition to the raw performance results, the GPU power consumption was monitored in real time to provide performance-per-Watt metrics from the GTX 1060 through the RTX 4060.
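For those curious how such performance-per-Watt figures are derived, the basic idea is to poll the GPU's reported power draw while a benchmark runs and then divide the benchmark score by the average wattage. A minimal sketch of that calculation is below; the `nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits` command is the real NVIDIA driver interface for sampling power draw, while the helper names and sample values here are purely illustrative and not taken from this article's testing:

```python
# Sketch: compute average power draw and performance-per-Watt from
# power.draw samples, as produced by periodically running:
#   nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits
# Function names and sample values are hypothetical, for illustration only.

def average_power(csv_lines):
    """Average a list of power.draw samples (Watts, one value per line)."""
    samples = [float(line.strip()) for line in csv_lines if line.strip()]
    return sum(samples) / len(samples)

def perf_per_watt(score, watts):
    """Benchmark score divided by average power draw."""
    return score / watts

# Made-up example samples (Watts) and a made-up benchmark score:
samples = ["114.53", "117.20", "115.87"]
avg = average_power(samples)
print(round(avg, 2))                        # → 115.87
print(round(perf_per_watt(1158.0, avg), 2)) # → 9.99
```

In practice the Phoronix Test Suite automates this sampling alongside each benchmark run, but the arithmetic behind the published performance-per-Watt charts amounts to the same division.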

