There are a number of significant differences between Fedora and Debian stable, among them a much older kernel, a much older X version, older libraries, and some libraries built with different configurations and options. They even use different boot/init procedures. We're getting into fuzzy areas here, but I take a holistic view of an operating system as a kernel, its associated userspace tools, and a base set of provided libraries and userspace applications. By that measure Debian stable and Fedora are rather disparate, and Fedora and RHEL5 even more so. There is no guarantee that everything you can compile on Fedora will compile on Debian stable without some intervention, and good luck getting all your Fedora bells and whistles working on RHEL5.
Even distributions with the same packaging system aren't necessarily compatible. Mandriva rpms don't necessarily work on Fedora boxes, and Ubuntu debs won't necessarily install on Debian boxes. Packaging standards, while useful, aren't the panacea you believe them to be. As it is, you can already convert rpms to debs on a Debian platform (with a tool like alien).
In this sense it makes sense to think of each distribution as its own operating system, because each ships with a different kernel (version, patches, etc.), different userspace tools, different libraries, and different userspace applications. Sure, some may be compatible, some of the time. But that is hugely dependent on which ones you are comparing.
Ok, firstly, let me say I personally think of standards and stable/consistent APIs as two different things, not necessarily one and the same. To me a standard is a collection of interfaces and/or best practices maintained in the interests of interoperability.

They're called standards. A web browser uses HTML standards. An FTP client uses FTP standards. FreeDesktop.org, the Linux Foundation, and other groups help create standards so that when you're in KDE and you want to run a Gnome app, for example, you can still do so. It empowers Linux users to communicate on the same page at certain points. It does NOT hamper progress, because if someone feels that the standard needs to be broadened, it can be, so that new features can be added. What version of HTML are we up to now, five? OpenGL 3? Those are standards which are quite solid, and they help everyone.
Just imagine a world without any standards, where everyone just went off and did their own thing. .....
HTML is a standard that allows interoperability between web browsers.
OpenGL was presumably created to give programmers a uniform way to speak to a variety of different hardware devices.
In that sense, yes, I agree that things like package management and desktop integration should be standardised. These are domains where interoperability between packaging formats or desktop implementations is important.
I fail to see how standardisation would help the kernel though. A controlling group already sets its direction. And what exactly does it need to interoperate with? Are you proposing some UNIX Kernel standard that requires all Unix kernels to be compatible with each other somehow? Or is this for device drivers only? Are you proposing a Unix Device Driver Standard? How realistic is that, and how necessary? POSIX, for example, already provides a cross-platform way to access various devices, so do we really need to further generalise the implementation?
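To illustrate what I mean by POSIX already covering this, here's a rough sketch (just an example; /dev/urandom is an arbitrary device node I picked, and the same calls apply to most character devices):

#include <fcntl.h>      /* open(), O_RDONLY */
#include <stdio.h>      /* perror(), printf() */
#include <unistd.h>     /* read(), close(), ssize_t */

int main(void)
{
    /* The same POSIX calls work on Linux, the BSDs, Solaris, etc.,
       without any extra "device driver standard" layered on top. */
    int fd = open("/dev/urandom", O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    unsigned char buf[16];
    ssize_t n = read(fd, buf, sizeof buf);  /* read raw bytes from the device */
    if (n < 0)
        perror("read");
    else
        printf("read %zd bytes from the device\n", n);

    close(fd);
    return 0;
}

The uniform interface is the file descriptor API itself; which driver actually sits behind the device node is the kernel's own business.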
Standards can add an unnecessary layer of fat to the development process. OpenGL is a pretty good example. It's a stagnant standard, and part of the trouble is its inertia. One of the problems facing OpenGL 2.0 was maintaining backwards compatibility. This came at the cost of implementing newer features, and was a pretty big disappointment to the OGL community. OpenGL 3.0 was promised to deliver some heavy improvements, but it was more of the same really. Maintaining a single framework to support legacy compatibility came at the cost of added features, performance, and usability, and OpenGL has been all but relegated to being the "cross-platform" option. D3D, on the other hand, while not an official "standard", is almost a de facto standard and has pretty much won the war due to its manoeuvrability and flexibility. People would get pissed off if OpenGL 4, 5, 6, 7, 8, and 9 changed the API every release, but they are currently pissed off that OpenGL 2 and 3 didn't change it enough.
Now, if you are commenting more on the stability between kernel versions, I feel my point still stands. Yes, ideally, updated kernels shouldn't be continually changing interfaces, but sometimes this is necessary. Maintaining legacy compatibility at the cost of significant architectural or performance improvements is pretty much the OpenGL trap. To be fair, I don't follow the driver side too much, but yes, I do remember when they updated the wireless stack, leaving me using flaky experimental drivers. That being said, I don't think the kernel guys are anti-standard or against stable interface layers at all. AFAIK they are pretty POSIX-friendly (look at the ext4 discussions) and layers like the VFS are pretty stable (cue Reiser4 flames). And from what I understand, the work happening now on in-kernel graphics drivers will also provide a more standard way for other display servers to access graphics hardware.
Well, expect compatibility between X/kernel versions and drivers to not be great until the new graphics stack settles down. It's not like they can work on it for 4 years and then release Linux 7. They release stuff incrementally, and when structural changes are made for the benefit of the system, things will break. This is unfortunate, but a reality. It IS a bit of a mess, but it has less to do with a lack of standards and more to do with the previously stagnant state of XFree86. It's only pretty recently (2005) that the graphics stack has started to undergo improvement, so most of us expect a little pain.

It doesn't even work on the major distros; that's one of the points of this thread. I had to downgrade to kernel 2.6.27 and xorg 1.5 (using the Ubuntu Intrepid packages). I assume it is, since fglrx won't work in the newer versions yet with my 4850x2.