The word "fragmentation" shouldn't be used unless you are going to define what fragmentation is. The classic definition for "fragmentation" is that different versions of the system are developed that are binary incompatible on the same type of hardware. According to this definition, Linux is not fragmented at all. Incidentally, package manager incompatibility is not the same thing as binary incompatibility. To take a case in point, you can download the same binary tarball for Firefox from the Mozilla site and run it on whatever distribution you wish as long as all of its dependencies are met (you may have some issues with the location of the Flash plugin or things like that).
Of course, when Miguel de Icaza uses the word "fragmentation" he is apparently talking about something different, but I can't tell for certain exactly what.
Incidentally, as I have always heard it, "dependency hell" primarily refers to two different binary programs requiring conflicting versions of the same library. In other words, it's pretty much exactly the same thing as "DLL hell" on Windows, and the methods for dealing with it can even be similar. I have seen both types of "library hell" in my time on computers, but not often in either case (both used to be more common than they are now).
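For anyone who hasn't run into it, the conflict is easy to model. This is a toy Python sketch of the shape of the problem (the program and library names are made up): two programs demand different versions of one library that can only occupy a single system-wide slot.

    # Toy model of dependency hell: one system-wide slot per library.
    installed = {}  # library name -> version

    def install(app, deps):
        for lib, ver in deps.items():
            if lib in installed and installed[lib] != ver:
                raise RuntimeError("%s needs %s %s, but %s %s is already installed"
                                   % (app, lib, ver, lib, installed[lib]))
            installed[lib] = ver

    install("alpha", {"libwidget": "1.2"})
    try:
        install("beta", {"libwidget": "2.0"})  # the classic conflict
    except RuntimeError as err:
        print(err)

In practice, Linux blunts this with versioned sonames (libwidget.so.1 and libwidget.so.2 can sit side by side), and Windows blunts it with private copies of the DLLs, which is part of why neither kind of hell is as common as it used to be.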
People seem to imagine, or to want to convince others, that binary program installation is somehow entirely different on Windows or Mac than it is on Linux. In reality they are much, much more similar than they are different.
For the most part, the difference is between software that is open source and freely available and software that is closed source and paid for. On Linux, binaries for open source programs don't bundle all their dependencies, because those are probably already present or easily installed from the distribution's repositories. On Windows, only the standard Windows DLLs are expected to be in place, so the libraries a new closed source binary needs are shipped with it, often installed into its own directory to avoid conflicts (DLL hell).
What people don't seem to realize is that this difference has nothing to do with Windows or Linux as such. If you install a Gtk program on Windows and you don't install Gtk, you'll find the program doesn't run. The pieces of these open source programs just don't tend to be spread around much on Windows, because none of them can be expected to be there already. On Linux, if you install a completely proprietary, closed source program, you will find that it often includes all of its own dependencies, just as it would on Windows.
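To make that concrete: a bundled proprietary Linux program typically ships its libraries in a private lib/ directory and starts through a small wrapper that points the dynamic linker there first. Real wrappers are usually shell scripts; this is a minimal Python sketch of the same idea, with bin/app as a placeholder for the real binary:

    #!/usr/bin/env python3
    # Launcher sketch: prefer the app's bundled libraries by putting
    # its private lib/ directory at the front of LD_LIBRARY_PATH,
    # then hand the process over to the real binary.
    import os, sys

    appdir = os.path.dirname(os.path.abspath(__file__))
    env = dict(os.environ)
    bundled = os.path.join(appdir, "lib")
    env["LD_LIBRARY_PATH"] = bundled + (
        ":" + env["LD_LIBRARY_PATH"] if env.get("LD_LIBRARY_PATH") else "")
    os.execve(os.path.join(appdir, "bin", "app"), ["app"] + sys.argv[1:], env)

The effect is the same as a Windows program loading the DLLs sitting in its own directory: the bundled copies win, and whatever the distribution ships never enters into it.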
Also, there is package management. Microsoft thought package management as found in Linux distributions was such a good idea that they created a package manager for Windows (although, with all the stuck software and bugs, it seems quite inferior to the major Linux package managers). Before that, all Windows software was installed using installation scripts, like those from Wise or InstallShield. Guess what: installation on Linux can also be done with installation scripts, and a number of proprietary applications use them (including a number of games). The only real difference here is that Windows now has just one package manager (even though it stinks) plus a fair number of lingering installation scripts, while Linux has several different package managers and uses installation scripts less often.
One thing I will say for Windows installation management is that the system does a fairly good job of tracking scripted installations and their uninstallation scripts, and of making the uninstall option appear transparently in the same interface as actual MSI packages.
People who are calling for unification of effort in Linux are apparently out of touch with reality. Open source software was created to give users and developers (the freedom is for both, by the way, and perhaps especially the freedom for users to become developers) the freedom to do what they want with the software and get what they need out of it. The current success of Linux is based on that possibility. It would never have gotten anywhere near where it is without it, so it's really rather ridiculous to curse the very thing that made Linux possible as the thing holding it back. The duplication of effort, the freedom to start yet another text editor project (and believe me, there are a lot of them), is what made Linux possible and successful. You can't have it both ways.
Apparently, there are those who think that the Windows interface (or at least the ones prior to 8) and the Mac interface are superior to what's available for Linux. I don't find that to be the case. Of course, if I did, there are still choices on Linux that do a fairly good job of mimicking the Windows or Mac interfaces, so I wouldn't be completely out of luck. For me, though, the interfaces I use on Linux are infinitely superior to what's available for Windows (I can't fairly judge the Mac, because I haven't really used one since OS 9 or earlier). Part of the reason is that the features I want are available. Part of it is that I can tailor the interface to suit the hardware: I can use Fluxbox or Openbox on old machines and have them perform reasonably well, and I can put Xfce on newer hardware if I want (it doesn't have to be all that new). I also have the option of KDE or GNOME if I want, though I have never found those desktops particularly desirable (plenty of people certainly have). Every attempt I've seen to duplicate on Windows the multiple-workspace environment I use on Linux ends up a pale shadow of the real thing, generally more annoying than helpful.