Compiz-Fusion "nvidia" Vs "fglrx" performance comparison (please?)
I was wondering: what would the performance delta on similar hardware be for AIGLX with the "nvidia" binary driver, "fglrx", and probably the "nouveau" and "radeon" drivers?
Doing a subjective comparison between my nVidia desktop setup and my girlfriend's ATI-based laptop, performance (as well as CPU utilization) is very unequal. On my desktop with my soon-to-be-replaced FX 5900 and driver 100.14.19, not only does performance seem better, but the setup also uses less CPU. The laptop doesn't have a "good" graphics adapter: according to the Catalyst Control Center it is a Radeon X1100 IGP, apparently on the PCI bus (I would have expected PCI-E, but apparently it isn't), running the 8.42.3 drivers. With these setups the performance gap is abysmal on stuff like the water effect (~12 FPS on the laptop versus 60+ FPS on the desktop, with Beryl set to sync to vBlank). Also worth noting: the laptop's CPU is orders of magnitude faster than my desktop's, my desktop runs 64-bit Fedora 7, and the laptop runs 32-bit Fedora 7.

For regular desktop use the laptop is actually very comfortable, and since you can switch the window manager on the fly (with full features/performance gain) between Beryl and Metacity (for example), it is not much of an issue for stuff like mild 3D games. However, the current lack of Xv in the fglrx drivers with AIGLX, though not a show-stopper (switching between Metacity and Beryl is, again, not a problem), is kind of annoying. Not even stuff from YouTube would play while running Beryl on the laptop.
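For reference, the on-the-fly switch mentioned above was typically done with the window managers' `--replace` option. A minimal sketch (the binary names `metacity` and `beryl` are the era-typical ones and may differ per distro; the script is guarded so it degrades gracefully when one is missing):

```shell
#!/bin/sh
# Sketch: swap window managers in a running X session (Fedora 7 era).
# Assumes era-typical binaries (metacity, beryl); command -v guards
# keep the script harmless when a binary is absent.

switch_wm() {
    wm="$1"
    if command -v "$wm" >/dev/null 2>&1; then
        # --replace tells the new WM to take over from the running one
        "$wm" --replace &
    else
        echo "$wm not installed; keeping current window manager"
    fi
}

# Drop to Metacity for Xv video playback, then back to Beryl:
switch_wm metacity
switch_wm beryl
```

This is the same trick that makes the missing-Xv problem bearable: drop to Metacity to watch a video, then switch back.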
At any rate, I'd like you guys at the test labs to put together a head-to-head nvidia vs. fglrx comparison, throwing into the mix the 2D comparison you ran for the open- and closed-source ATI drivers, as well as gaming, AIGLX, features, etc.
Couldn't resist, so I did some testing on my machine, and to my amazement the water effect had little impact on CPU usage, while the frame rate stayed rock solid above 60 FPS... And to think that by today's standards this machine is well below the low end. Granted, the GPU still has more power (if not features) than many current low-end/mid-range GPUs...
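The 60 FPS ceiling here is just the vertical-refresh cap. With the nvidia binary driver, vblank syncing can also be forced per application through the `__GL_SYNC_TO_VBLANK` environment variable (a driver-specific knob; other drivers ignore it). A guarded sketch using glxgears as the test app:

```shell
#!/bin/sh
# Sketch: run a GL app with buffer swaps synced to the display refresh.
# __GL_SYNC_TO_VBLANK is specific to the nvidia binary driver; guarded
# with command -v so it only runs when the app is installed.

run_synced() {
    app="$1"
    if command -v "$app" >/dev/null 2>&1; then
        __GL_SYNC_TO_VBLANK=1 "$app"
    else
        echo "$app not installed"
    fi
}

# With vblank sync on, glxgears should report roughly the refresh
# rate (~60 fps on a typical panel) instead of an uncapped number.
run_synced glxgears
```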
I was pretty much amazed to see this:
Warning: the linked image is a high-quality .png file weighing 1.3 MB
Not bad for an "obsolete" system huh?
I remember that an old GeForce MX 400 with the nvidia binary driver ran Compiz much more smoothly than a much "better" ATI 9600 XT with the first fglrx driver to support AIGLX.
Last edited by Betel; 12-20-2007 at 10:09 AM.
Originally Posted by Thetargos
Your comparison falls apart from the very first step: you're comparing an IGP to a discrete desktop GPU (even ignoring the raw speed differences between the cards). That issue alone will have a drastic effect on any test. For one, the added CPU usage very likely comes from the fact that one of the chips is an IGP; integrated anything will always be worse (in broad terms) than a discrete part. It should also be noted that the 5900 is not that much faster than the X1100 (the X1100 is nothing but an X300, I believe).
You also have the OS difference; one is 32-bit and the other 64-bit. 64-bit runs much better on all my machines, especially when it comes to GPU operations.
Better to compare either two IGPs or two discrete GPUs. I can't, since I don't have any nvidia parts (on my Linux machines). I can attest to the issues of the current fglrx drivers, like the lack of a proper overlay with Compiz (that alone drives me insane).
I can also give you my own numbers. On my laptop (2 GHz CPU, 1 GB RAM, 200M IGP) I can run Compiz w/ AIGLX using the latest drivers (7.11) fairly well, with the benchmark giving me an average of 20 fps. My X1950 XT? Forget about it; it eats Compiz alive, but it still suffers from the issues I mentioned above.
I know I was comparing apples to oranges, and that's why I started with: "I was wondering, what would the performance delta on similar (AKA comparable) hardware be for AIGLX with the "nvidia" binary driver, the "fglrx", and probably "nouveau" and "radeon" drivers."
However, I later ran a test on my motherboard's IGP (Unichrome [Pro]) and the results, though not as good as the GeForce's, were indeed faster than the X1100's. CPU utilization was also higher than with the GeForce, but not as high as with fglrx. Odd.
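For anyone reproducing these cross-driver comparisons, it helps to confirm which renderer the X session actually picked up, and whether Xv is available, before benchmarking. A sketch using the standard glxinfo/xvinfo tools (both need a running X session, so everything is guarded and falls back to a message):

```shell
#!/bin/sh
# Sketch: report the active GL renderer and Xv support for the
# current X session. Degrades to a message when the tools or the
# display are unavailable.

report_gl() {
    if command -v glxinfo >/dev/null 2>&1; then
        glxinfo 2>/dev/null | grep -E "direct rendering|OpenGL renderer" \
            || echo "glxinfo present but no usable X display"
    else
        echo "glxinfo not available"
    fi
}

report_xv() {
    # fglrx's missing Xv under AIGLX shows up here as no adaptors
    if command -v xvinfo >/dev/null 2>&1; then
        xvinfo 2>/dev/null | grep "Adaptor" \
            || echo "no Xv adaptors (or no usable X display)"
    else
        echo "xvinfo not available"
    fi
}

report_gl
report_xv
```

The "OpenGL renderer" line is the quickest way to tell whether you landed on the binary driver or fell back to software rendering.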
Compiz with fglrx is unusable today, so it's not comparable.
A GeForce FX 5200 runs much smoother than a Radeon X1600 XT - sad
Yeah, I just got an nvidia GeForce 8600 GT; it seems way faster than the ATI 2600 Pro in the Quake 4 benchmark.