Is that in game? I can't get past the title screen, all I see is the background. The textures on the planet are SLIGHTLY glitchy.
Originally Posted by pete910
I have a Radeon 4870, Catalyst 10.6. Game works fine on my laptop (nvidia), but my desktop is the faster machine (and I have no convenient place to use my laptop outside of work).
I've tried Shadow Grounds and that works fine (except character portraits are glitchy, but this has always been the case).
ATI's drivers are notoriously picky about how their OpenGL state engine is used. There are lots of places where NVidia's drivers will substitute defaults for missing shader values, or handle "bogus" parameters passed into a call a lot more gracefully than ATI's will. The same goes for D3D, actually. That's mostly where ATI's reputation for "unstable" drivers actually comes from.
Not that the studios are in the right when they do something "off"- and it happens often.
If it were me, I'd log a bug through normal AMD channels and try to log a support ticket with LGP to get the ball rolling on this one.
I guess the obvious question is whether propagating implementation-specific quirks from one vendor to another is the right solution (which basically means changing the OpenGL spec without actually documenting anything), or whether the missing/bogus parameters should simply be fixed in the app?
Depends on the situation. I think you'll find me wanting someone to fix their busted game, myself- but there's a place where that may not work well. For example:
Originally Posted by bridgman
If the implementation of the state engine complies with the spec, and the vendor-specific quirk ALSO complies with the spec (e.g. anywhere you see a "may" in the mix... you do it one way, they do it another, and the quirk of their operation allows the game to work as intended)- what do you do? In that context, you're going to have a difficult time (more so than you'd usually have- which is... entertaining... as you well know...) trying to convince someone like BioWare or Valve NOT to do it the way they did, even if it was mildly stupid.
Another example would be a top AAA title that's still selling, but the studio's gone poof on everyone- in that situation you're unlikely to get anyone to fix their busted shaders or GL usage. There are a few of those lying about. There are also studios that won't bother because "it works on NVidia!".
Do you hack up your driver to make it work or let people bitch about your drivers being broken (even though they're not...)? I think we both know the answer to that one.
There's a good reason I defend your team and the Windows one. I've been there, done that- and it does little good to smear you guys like others do in the various web forums and elsewhere.
Yeah, I think we would try to support apps which met the GL spec, including the "may" parts; it's just the out-of-spec apps that are harder to justify.
We used to hack the drivers to make out-of-spec apps work too, but that quickly turns into a no-win situation all round, and after 4-5 years of making out-of-spec apps work you basically have to toss the driver and start over.
How do I get a support ticket going with LGP? Is there a specific web form to fill out that I'm not seeing, or do I just start pinging firstname.lastname@example.org?
Originally Posted by Svartalf
Where do I file bugs with AMD? Google is only giving me sites for filing bug reports with various linux distribution vendors.
That's basically it, see http://www.linuxgamepublishing.com/support.php?
Can't say I've had much luck reporting bugs to LGP, but I guess they're just overworked.
Thanks. I sent LGP an email. Hopefully this will start a dialog where I can provide them with any further information that they need.
Originally Posted by whizse
I've also found an issue where x3 ignores the search order provided by ld.so.conf and loads the wrong libGL.so.1. This is an Ubuntu issue (perhaps other distros are affected as well?) and you can work around it by running x3 in one of the following ways:
LD_LIBRARY_PATH=/usr/lib32/nvidia-current x3 # Change nvidia-current to fglrx if applicable.
x3 -g /usr/lib32/nvidia-current
In Ubuntu, /etc/ld.so.conf.d/* gives /usr/lib32/nvidia-current priority over /usr/lib32, but x3 doesn't seem to care unless you force the issue.
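The precedence the workaround relies on can be sketched with a quick self-contained demo (no game binary needed; /tmp/glpath-demo and the dummy libGL.so.1 files are made up for illustration): directories on LD_LIBRARY_PATH are searched before the ld.so.conf/cache paths, so whichever copy of the library sits there wins.

```shell
# Simulated lookup only -- this illustrates the loader's search order,
# it does not invoke the real dynamic linker. Paths are made up.
demo=/tmp/glpath-demo
mkdir -p "$demo/nvidia-current" "$demo/lib32"
echo "driver GL"  > "$demo/nvidia-current/libGL.so.1"
echo "default GL" > "$demo/lib32/libGL.so.1"

# ld.so checks LD_LIBRARY_PATH entries left to right BEFORE falling back
# to the ld.so.conf/ld.so.cache directories, so putting the driver dir
# first overrides whatever ordering the game otherwise ends up with:
search_path="$demo/nvidia-current:$demo/lib32"
IFS=:
for dir in $search_path; do
    if [ -e "$dir/libGL.so.1" ]; then
        cat "$dir/libGL.so.1"   # first hit wins, as with the real loader
        break
    fi
done
unset IFS
# -> prints "driver GL"
```

Swap the two directories in search_path and the Mesa-style copy would be picked instead, which is essentially what happens to x3 when the override isn't forced.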