The same issue for me...
Testing Mandriva packages. I started ksysguard and then glxgears. Memory use increased, and after about a minute the extra memory was about 250 MB and still growing. Closing glxgears freed the memory instantly.
ATI Mobile X1400 on a Dell inspiron 6400.
I can't see the issue here on my ATI Radeon X1200, using the Gnome System Monitor.
When I run fgl_glxgears, its memory use keeps increasing and then stops at 55.6 MB.
I think it's an R500-part problem, from the overall impressions I've been gathering in this thread: the X1400 is a mobile version of the RV515. You, in spite of the X1200 moniker, have an R400 derivative, which operates slightly differently from the rest of the X1xxx series parts. I really, really detest their insistence on muddying up the product space with things like the X1200 actually being an R400 part. It makes for fun trying to sort out these problems, and it doesn't really help their marketing efforts either, because most people buy based on advice from people like us, who figure out real damn quick what a card is and isn't.
Originally Posted by Extreme Coder
I'm about to set up a machine with a few differing distributions on it- and an X800 pro and an X1300 card. I'll see if it's distribution specific, device class specific, or can't reproduce with the cards I've got.
Last edited by Svartalf; 11-20-2007 at 02:55 PM.
That is interesting. I've been thinking along the lines of even trying different screen resolutions and sizes of the glxgears window, just for the heck of it. Can't be worse than ATi's bug-testing procedures anyway.
Originally Posted by Svartalf
Happens for me, too - it locks X up (the *Lock keys, Ctrl-Alt-F1, Ctrl-Alt-Backspace, etc. don't work), but the Magic SysRq key (and, presumably, SSH) still works. fgl_glxgears locked up my system in under 10 seconds, after I noticed it had taken up a few hundred MB.
x86_64, kamikaze-sources-2.6.22-r9 on Gentoo. I have an X1900GT.
Fun exercises that nvidia users miss out on
Well, I finally got around to playing with the size of the glxgears window to see if it had an effect on the rate of the memory leak.
I have an ATi X1400 running on a Z61m Thinkpad. I use Frugalware, which still uses 8.42.3 (but as far as I've understood, the memory leak was not fixed in 7.11 anyway).
The native resolution of the laptop screen is 1680x1050. I toggled the glxgears window between its default size of 300x300, some intermediate resolutions, and 1680x1050, expecting it either to eat memory even faster than before when I increased the window size, or to leak at the same rate if the glxgears load stayed constant and only the fps dropped. As you all know, maximizing that window lowers the frames-per-second count, since more pixels need to be computed by the graphics card.
This is what I got (please note that I simply monitored the output of ps aux every 5 seconds for a minute, after discarding the first 5 "burn-in" seconds, so the figures are approximate):
Window size --- FPS --- Leaked RSS
300x300 --- 3740 --- ~ 14MB/5s
600x600 --- 1310 --- ~ 5MB/5s
800x800 --- 760 --- ~ 3MB/5s
1000x1000 --- 500 --- ~ 2MB/5s
1680x1050 --- 300 --- ~ 1MB/5s
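For anyone who wants to reproduce this, here is a rough Python sketch of the sampling I did. It reads VmRSS straight from /proc instead of parsing ps aux, and the interval and sample counts are just placeholders matching my setup - adapt as needed:

```python
import os
import time

def rss_kb(pid):
    """Read resident set size (VmRSS, in KB) from /proc/<pid>/status (Linux only)."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])  # kernel reports the value in kB
    return None

def sample_leak(pid, interval=5, samples=12, burn_in=1):
    """Poll RSS at fixed intervals and return the average growth per interval,
    skipping the first burn_in readings."""
    readings = []
    for _ in range(samples + burn_in):
        readings.append(rss_kb(pid))
        time.sleep(interval)
    readings = readings[burn_in:]
    deltas = [b - a for a, b in zip(readings, readings[1:])]
    return sum(deltas) / len(deltas)  # average KB leaked per interval
```

You would point `sample_leak()` at the PID of the running glxgears process (e.g. from `pidof glxgears`).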
Well... this shows two important things:
1) I do not know much about graphics driver development and infrastructure since my expectations were completely wrong.
2) The bug is indeed related to the FPS rate (which, of course, is inversely related to window size). The bug appears to cause a small memory leak each time a frame is updated. This is easily demonstrated by looking at the FPS count and the memory leaked over 5 seconds (more detailed figures):
Leak / Frame
300x300: 14289KB / ( 3740FPS * 5s ) = 0.76KB/frame
600x600: 4989KB / ( 1308FPS * 5s ) = 0.76KB/frame
800x800: 2890KB / ( 758FPS * 5s ) = 0.76KB/frame
1000x1000: 1940KB / ( 499FPS * 5s ) = 0.78KB/frame
1680x1050: 1129KB / ( 296FPS * 5s ) = 0.76KB/frame
The memory leaked per rendered frame does not depend on the frame size; it is constant at roughly 0.76 KB.
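In case anyone wants to check my arithmetic, the per-frame figures fall straight out of the measurements (numbers copied from the table above):

```python
# (window size, KB leaked over 5 s, average FPS) from the measurements above
data = [
    ("300x300",   14289, 3740),
    ("600x600",    4989, 1308),
    ("800x800",    2890,  758),
    ("1000x1000",  1940,  499),
    ("1680x1050",  1129,  296),
]

for size, leaked_kb, fps in data:
    frames = fps * 5                 # frames rendered during the 5 s window
    per_frame = leaked_kb / frames   # KB leaked per frame
    print(f"{size}: {per_frame:.2f} KB/frame")
# every window size comes out at roughly 0.76-0.78 KB/frame
```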
If we assume that this particular leak occurs the same way whenever a frame is rendered in any OpenGL app (that is, the bug is hit the same number of times per frame as in glxgears), and that I have 1 GB of RAM available for Doom3 after it has started up, with an average frame rate of 20 fps during gameplay, then I would be able to play the game:
1048576KB / (0.76KB/frame * 20fps) =
19 hours and 10 minutes
before starting to worry about a crash. Or even longer if I crank up the resolution a bit.
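Spelled out, the back-of-envelope estimate looks like this (the 1 GB of headroom and the 20 fps are just my assumptions from above):

```python
headroom_kb = 1048576    # assumed 1 GB of RAM free after Doom3 has started
leak_per_frame = 0.76    # KB leaked per frame, from the glxgears measurements
fps = 20                 # assumed average in-game frame rate

seconds = headroom_kb / (leak_per_frame * fps)
hours, rem = divmod(seconds, 3600)
print(f"{int(hours)} h {round(rem / 60)} min")  # → 19 h 10 min
```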
Happy holidays everyone and take care
Seriously, shouldn't it be possible to use glxgears together with valgrind or some other tool to pinpoint where and when the memory leak occurs, so that we can file a real bug report with ATi?
Last edited by korpenkraxar; 12-21-2007 at 04:07 AM.
Not many of us do...
Originally Posted by korpenkraxar
A few people have done this, and it appears that the culprit is a leak in the implementation of glClear(), which would make sense, given that the leak rate is directly proportional to the frame rate.
Originally Posted by korpenkraxar
Yup. Some quick googling returns this: http://ohioloco.ubuntuforums.org/sho...588383&page=23
Originally Posted by happycampers