Considering the FPS of ~100-110, it would be safe to assume VSync is off.
Originally Posted by nanonyme
Vsync wouldn't make Gallium stick to 20 FPS either.
The conclusion is:
1) Gallium is experiencing FLAT lines
- This means it is being maxed out by the CPU in those tests.
2) Gallium starts with a lower frame rate than Mesa but fares better at higher resolutions.
- This means there is a lot of overhead (CPU?) but it scales well.
1) r300g uses a SLOW (at least until very recently) flush+sync (glFinish) implementation since the current implementation of swap&sync is still considered experimental.
2) I'm not sure whether color tiling is automatically enabled on r500. It should give a nice boost.
I'll be happy if a dev can correct/complete me.
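The flush+sync point in (1) can be illustrated with a toy timing model: a glFinish-style sync blocks the CPU until the GPU finishes each frame, while a proper swap lets CPU and GPU work overlap. All timings below are hypothetical, not measurements:

```python
# Toy model of why flush+sync (glFinish every frame) is slow: it serializes
# CPU and GPU work (frame time = cpu + gpu), while a pipelined swap lets the
# CPU prepare the next frame while the GPU draws (frame time ~ max(cpu, gpu)).
# The 5 ms timings are made up for illustration.

def fps_synced(cpu_ms, gpu_ms):
    return 1000.0 / (cpu_ms + gpu_ms)    # CPU waits for the GPU every frame

def fps_pipelined(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)  # CPU and GPU work overlap

print(fps_synced(5.0, 5.0))     # 100.0
print(fps_pipelined(5.0, 5.0))  # 200.0
```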
However, the exact same slow implementation exists in r300c.
Originally Posted by kbios
Yes, it is, but you need latest xf86-video-ati from git.
Originally Posted by kbios
I guess I should figure out why it fails so badly in Tremulous and Smokin' Guns.
Well, to be fair, it's already impressive how much faster it has gotten compared to March, in only 3 months. But remember, people: Mesa 7.9 is under constant heavy development, which can lead to issues until the code stabilizes. For example:
Originally Posted by marek
* Many operations inside libGL may be only partially implemented (remember, there can be several algorithms for one function, each optimizing rendering for a different set of conditions).
* Some operations may currently be handled through llvmpipe until the proper GPU code is optimized.
* A game could be using a GL extension that isn't implemented yet, so it either falls back to llvmpipe or generates an error and falls back.
* Mesa 7.9 hasn't reached the optimization stage for the rendering code yet, because the developers are still focused on GL 2.1/GLSL features. Many algorithms are written just to the point where they render correctly, and are left that way pending further optimization once that stage is reached.
So, in conclusion: this is still very early code (Mesa 7.9 isn't even in alpha), so many things can go wrong inside libGL, but even so they are showing outstanding progress nonetheless.
lol, if they get the driver rendering at 50% of fglrx's speed or more in Mesa 7.9, hell, why use fglrx after that? (If anyone wants to answer, just point to things that actually work 100% of the time without glitches, slowdowns, or poor quality. I made it really hard, didn't I?)
Well, all I can say is that I'm very happy with the way the Gallium3D radeon driver is improving. I've been using it for about a week now, and I don't care as much about games as I do about desktop performance. With Gallium, KDE/KWin desktop effects definitely run a bit more fluidly. The two other 3D apps I care a lot about are Stellarium and Blender. I haven't had time to test those with Gallium yet, but I'll see after I finish my exams. Keep up the great work, guys!
If this is meant to show how much the Gallium driver has improved, I would think it would be beneficial to put on the graph a curve that represents the performance of the Gallium driver in March (for those of us who are visually inclined).
And I definitely think a Catalyst curve would be useful as well, to show how much potential improvement is left (roughly, recognizing that optimizations get harder to come by the better things get).
Yeah, let's go with a 10x slower software rasterizer, that'll sure make a huge difference... No, really, you can't beat hw with a software-based solution, even with llvmpipe, no way. The hardware is damned fast. I think we're CPU-limited and I have an idea how to improve it...
Originally Posted by jrch2k8
Using llvmpipe for SW emu in areas is not really feasible on a non-tightly-integrated IGP (i.e. anything that isn't Llano or Sandy Bridge)
I'm very impressed with how r300g's coming along! Keep up the good work!
Also very impressed with the progress
I'd kind of like to see how Gallium3D stacks up against the actual hardware potential. If you could throw the Windows and Xorg Catalyst drivers into the comparison, take the best performance of any driver at each resolution as the estimated hardware capacity, and express Gallium3D's performance as a percentage of that, that'd be cool.
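A minimal sketch of the normalization being proposed, with entirely made-up FPS numbers (the driver names and figures are illustrative, not benchmark results):

```python
# Treat the best FPS from any driver at each resolution as the estimated
# hardware capacity, then express one driver's result as a percentage of it.
# All FPS values below are hypothetical.

def percent_of_best(results, driver):
    """results: {resolution: {driver_name: fps}} -> {resolution: percent}"""
    out = {}
    for res, fps_by_driver in results.items():
        best = max(fps_by_driver.values())  # estimated hardware capacity
        out[res] = 100.0 * fps_by_driver[driver] / best
    return out

results = {
    "1024x768":  {"gallium": 40.0, "catalyst": 100.0},
    "1920x1080": {"gallium": 25.0, "catalyst": 50.0},
}
print(percent_of_best(results, "gallium"))
# {'1024x768': 40.0, '1920x1080': 50.0}
```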
Aren't the FPS values with VSync 120, 60 and 30? (Or am I recalling completely wrong?)
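Those numbers look like VSync quantization: with double buffering, a frame that misses a refresh waits for the next one, so achievable rates are the refresh rate divided by an integer. A quick sketch (the 60 Hz and 120 Hz refresh rates are assumptions, not something the thread confirms):

```python
# With VSync and double buffering, frame rates lock to refresh_hz / n for
# integer n, because a late frame waits for the next vertical refresh.

def vsync_rates(refresh_hz, count=5):
    return [refresh_hz / n for n in range(1, count + 1)]

print(vsync_rates(60))   # [60.0, 30.0, 20.0, 15.0, 12.0]
print(vsync_rates(120))  # [120.0, 60.0, 40.0, 30.0, 24.0]
```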
Originally Posted by Sacha