I'm not sure what this is due to, but it has been a while since I saw something like this on an ATi-based graphics card. The last time was with an R300-based Radeon 9500 (when its fan died), or similarly when I tried to soft-mod it to a 9700 (half the texture units were not functional, so I got a checkerboard effect).
As the title of this thread says, my problem is image corruption... These two images are a photograph of the screen when the corruption happened, and a screen grab taken with gnome-screenshot while the screen was corrupted.
I'm not sure if this is due to the driver itself or to the package I installed; I got the drivers from Livna on Fedora 8. Besides this corruption problem with 2D: as you can see in the console output and the regedit screenshot, DirectDraw is set to be rendered through OpenGL, but for some reason it is being rendered indirectly, and the screen gets corrupted.
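For what it's worth, the direct/indirect rendering state can be confirmed from a terminal. This is just a diagnostic sketch, assuming the `glxinfo` utility (from the glx-utils/mesa-demos package) is installed:

```shell
# Ask the GLX client whether it got direct rendering.
# "direct rendering: No" means all GL goes through the X server (indirect).
glxinfo | grep "direct rendering"

# libGL can also be made to explain why direct rendering failed
# (missing kernel module, wrong driver, permissions on /dev/dri/*):
LIBGL_DEBUG=verbose glxinfo > /dev/null
```

Both commands need a running X session, so they only make sense run from a terminal inside the affected desktop.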
Another issue is that direct rendering applications seem to crash on exit (segmentation fault).
I'll try to restore all the files to the state they should be in as much as I can (i.e., no leftovers from a previous driver installation, and such) and test again.
So I feel I have to ask: has anyone else seen this with 8.6?
Actually I did use the search, but apparently didn't use the right keywords, as your problem is indeed pretty much the same as mine... However, I forgot to mention that the corruption didn't happen only with Wine (or DirectX emulation), but with other GLX applications as well. I'll follow your thread to see what others have said about this.
Am I reading the error messages correctly? Is Wine trying to set 8bpp? If so...
Yes, it is. Those particular console messages were produced when trying to run StarCraft, which is "locked" to 256 colors (8-bit color) with DirectDraw. That is one of the reasons it is generally advised to set DirectDrawRenderer to OpenGL in regedit when running StarCraft: supposedly this cures the speed problems (which come from having to convert in software from a higher bit depth to a much lower one), and since OpenGL doesn't really care about the bit depth of the "textures", the whole GL context would be 32-bit even if the textures are 8-bit. At least that's what I remember reading in the AppDB forums for StarCraft. Other programs, of course, don't have this restriction.
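For reference, that setting lives under HKEY_CURRENT_USER\Software\Wine\Direct3D in Wine's registry. A minimal sketch of putting it in a .reg file, rather than clicking through regedit (the key and value names are the ones Wine documents; the filename is arbitrary):

```shell
# Write the DirectDrawRenderer setting discussed above to a .reg file.
cat > ddraw-opengl.reg <<'EOF'
REGEDIT4

[HKEY_CURRENT_USER\Software\Wine\Direct3D]
"DirectDrawRenderer"="opengl"
EOF
```

It can then be imported with `wine regedit ddraw-opengl.reg`, or the value created by hand in regedit.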