I get "DKMS part of installation failed" when installing them. I never had this problem on previous releases.
Because you use 64-bit; it's a 32-bit-only bug.
Because you use 32-bit and kernel 3.2.8/3.3+.
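When the installer reports "DKMS part of installation failed", the actual compiler error ends up in the DKMS build log rather than on screen. A minimal sketch for digging it out; the path layout is DKMS's standard tree, but the exact fglrx version directory on any given system is an assumption, so the glob below just picks the newest one:

```shell
# Find and show the tail of the most recent fglrx DKMS build log.
# /var/lib/dkms/<module>/<version>/build/make.log is where DKMS keeps
# the compile output; adjust if `dkms status` shows something different.
LOG=$(ls -t /var/lib/dkms/fglrx/*/build/make.log 2>/dev/null | head -n 1)
if [ -n "$LOG" ]; then
    tail -n 40 "$LOG"
else
    echo "no fglrx DKMS build log found"
fi
```

The last 40 lines are usually enough to see which kernel interface change (e.g. in 3.2.8/3.3+) broke the module build.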
[fglrx] module loaded - fglrx 8.96.4 [Apr 5 2012] with 1 minors
Originally Posted by fritsch
Thx very much for the dmesg output.
Pkg build on Ubuntu 11.10
I was able to build the .deb packages using the instructions on the cchtml wiki... unfortunately Catalyst 12.4 (fglrx 8.96) still does not let me achieve 4-way CrossFire with my HD 7970s. Disappointing, as tech support at AMD explicitly stated that it would work when I called them.
By the time AMD ships a driver that allows this, I will be 3+ generations behind the newest adapter. Very disappointing, very frustrating.
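For anyone else wanting to reproduce the package build from the post above: the cchtml wiki's instructions come down to the AMD installer's package-build mode. A sketch, assuming a 12.4 installer download on Ubuntu 11.10 (the exact filename is an assumption; match it to what you downloaded):

```shell
# Build distro packages from the Catalyst installer instead of letting
# it install directly. --buildpkg with a distro target is the
# installer's documented packaging mode.
INSTALLER=amd-driver-installer-12-4-x86.x86_64.run
if [ -f "$INSTALLER" ]; then
    chmod +x "$INSTALLER"
    sh "./$INSTALLER" --buildpkg Ubuntu/oneiric
    # then: sudo dpkg -i fglrx*.deb
else
    echo "installer $INSTALLER not found in current directory"
fi
```

Installing via the generated .debs keeps the driver under dpkg's control, which makes later removal much cleaner than the installer's direct mode.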
Anybody know why it's always like this with fglrx? I myself have had countless failed installations with fglrx, and the exact opposite with the Nvidia binary blob.
Why? Why? Why?
Catalyst has an extremely conservative release schedule. Instead of letting fixes get out to users, they hold developed code back for 3-4 months across development, beta, and RC builds before they push it out in a public Catalyst release. Some releases of Catalyst are based on the same trunk version (e.g. 8.96 is a trunk version) as previous months' releases; such a Catalyst release therefore consists of critical bug fixes only, and no new features. They consider Linux kernel support a "feature", so you see where I'm going with this.
Originally Posted by t.s.
Due to the release cycle of Catalyst, code that a developer writes today for Linux 3.4 support is not going to hit the public until at least 2 months from now, but more likely 3 or 4 months. So somebody within ATI is right on top of the latest kernel developments and constantly working on Catalyst to make sure it supports the latest Linux kernel ABI, but... these changes take months and months to get through their release cycle!
If you were to compare the release cycles of Linux distros to the release cycles of graphics drivers, you'd have something like this:
Catalyst = Red Hat Enterprise Linux or Debian Stable. Releases very infrequently with heavily "tested" / "QAed" code, with the goal of minimizing regressions. The good part of this model is that, if your driver is already working well and you upgrade to the new version, it isn't going to make anything worse. It will be unlikely to introduce new problems, assuming the environment it's running in is static (this assumption is the primary reason why the Catalyst release cycle clashes horribly with the Linux release cycle to produce the current situation. Linux is anything but static, and any software that wants to keep up with it absolutely must make frequent releases of breaking changes.) I'll call this model the Conservative Catalyst (following Ubuntu's naming convention of adjective and noun starting with the same letter, just for fun).
Nvidia binary = Fedora development branch or Arch Linux. Releases more frequently, with somewhat less testing and QA, with the goal of getting the latest and greatest stuff out to the users once they're fairly confident that it "works". The driver's architecture is more volatile than Catalyst's because any given release of the Nvidia binary can happen at any time. The release schedule is not as heavily based on time intervals, so much as releasing what they have, when it's ready and working for most people. The upside of this model is, if you have a bug or a feature you want, you will get it faster. The downside is, you are more likely to see regressions and new bugs crop up with each release. This process forms a middle ground between Conservative Catalyst and Liberal Libre, which I'll call the Solid Shipper, in homage to the fact that the Nvidia binary releases are generally "solid", if not timely.
Open source drivers = compiling packages from version control and building your own distro a la Linux From Scratch (LFS) or Gentoo ~x86. There is little to no coordinated QA; releases are arbitrary lines in the sand rather than representing a finished product; and you're free to shoot yourself in the foot by pulling the latest code from upstream, which gives you both the latest features and the latest bugs. The huge advantage of this model is flexibility: those who want the latest and greatest can have it; those who want something more stable and tested can perform their own QA (as distros that ship released versions of Mesa often do) and try to work by a development model similar to Catalyst. So you have the full spectrum of flexibility in how "old and crusty / tested" you want your packages: zero testing, a little testing, or a whole lot. It's up to you, and/or your distribution maintainers. This is what we know as the Liberal Libre, where anything goes. There's no downside to this model except that the proprietary driver companies aren't using it, forcing their users to endure one of the previous models.
What it comes down to is that the only optimal development cycle that would suit the Linux ecosystem would be another open source project, not these proprietary projects with their own out-of-sync release schedules.
Well, I'm on Ubuntu 12.04 x64 with an ATI 4770, and I have three big issues:
1) Catalyst seems to forget my options. I managed to get dual monitors working (with different resolutions, even!), but after I reboot it forgets my configuration... really annoying.
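Settings made in amdcccle don't always get written to disk; a sketch of persisting a dual-head setup into /etc/X11/xorg.conf with aticonfig's standard options (the `right` layout direction is an assumption, pick whichever matches your desk):

```shell
# Write a dual-head configuration to /etc/X11/xorg.conf so it survives
# a reboot. --initial=dual-head and --screen-layout are standard
# aticonfig options shipped with fglrx.
if command -v aticonfig >/dev/null 2>&1; then
    sudo aticonfig --initial=dual-head --screen-layout=right
else
    echo "aticonfig not found (is fglrx installed?)"
fi
```

After that, per-monitor resolutions set in amdcccle or xrandr should at least start from a layout the driver remembers.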
2) This is the worst of the three: with fglrx I get choppy sound. Scrolling in Firefox, Chromium, or even the Nautilus file manager makes the audio stutter (this does not happen with the FOSS driver).
3) I managed to get flicker-free video with the AMD Catalyst option, but, to be honest, it makes the whole desktop a little choppy. Is there any way to make VLC activate that option in full screen and then deactivate it on exit?
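One way to approximate a per-application toggle is a wrapper script around VLC. A sketch, with caveats: `DDX,EnableTearFreeDesktop` is the persistent-configuration key that amdcccle itself writes for the tear-free option, but whether flipping it takes effect without restarting X varies by driver version, so treat this as an experiment rather than a guaranteed fix:

```shell
#!/bin/sh
# Hypothetical wrapper: turn on Catalyst's Tear Free Desktop, run VLC,
# then turn it off again when VLC exits. --set-pcs-u32 writes a value
# into the driver's persistent configuration store.
if command -v aticonfig >/dev/null 2>&1; then
    aticonfig --set-pcs-u32=DDX,EnableTearFreeDesktop,1
    vlc "$@"
    aticonfig --set-pcs-u32=DDX,EnableTearFreeDesktop,0
else
    echo "aticonfig not found (is fglrx installed?)"
fi
```

Save it as something like `vlc-tearfree`, make it executable, and point your desktop launcher at it instead of vlc directly.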