Okay, I have watched the whole video and I see some general problems with some of the viewpoints, though some things are accurate.
1) Yeah, it would be nice to have some "general framework" for audio/video as a frontend for developers to rely on. Though this is basically what KDE tries with Phonon, which provides one framework to KDE devs so that they don't have to fiddle with the internals. Yes, you can easily argue whether the "oh my god, another layer" approach is the correct step, and I am not sure at all...
2) "Packaging has to be unified so that only one standard exists." Hmm, a hard thing... Especially because the way he argues is strange: he puts binary packages (.deb and .rpm) into the same bin as "build descriptions" for Gentoo. This is, uhm, strange/wrong/whatever. Though in general it would of course be nice to have some "standard" specified for packages. This does not work simply because the package files also contain *dependency* declarations, and those differ from distribution to distribution (layout of packages, different ones included, versions, ...). In general he claims that commercial vendors have problems providing stuff simply because every distro does it differently. Personally I have to agree with someone from the audience: the company should provide an installer with some basic functionality (working from the command line *and* graphically!). This installer can a) be used directly by the user and b) (thanks to the command-line hooks) be used by the distribution maintainers to easily write "pseudo packages" that do the installation of the programs.
So what would basically be needed is a "standard installer" providing hooks that allow the user to easily install the software and allow the package managers (and maintainers) to include it in the official trees.
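To make the idea concrete, here is a minimal sketch of what such a vendor installer's command-line side could look like. Everything here is hypothetical: the name `vendor_install`, the `--prefix` and `--unattended` flags, and the install location are illustrative assumptions, not an existing standard.

```shell
#!/bin/sh
# Hypothetical vendor installer sketch. A non-interactive mode plus a
# configurable prefix are the two hooks a distribution's "pseudo package"
# would need to wrap this in its own package format.
vendor_install() {
    prefix="/opt/vendorapp"   # default location (assumed; /opt per FHS)
    unattended=no

    # Parse the command-line hooks a package manager would use.
    while [ $# -gt 0 ]; do
        case "$1" in
            --prefix)     prefix="$2"; shift 2 ;;
            --unattended) unattended=yes; shift ;;
            *) echo "unknown option: $1" >&2; return 1 ;;
        esac
    done

    if [ "$unattended" = yes ]; then
        # No prompts here, so a packager's post-install script can call
        # this safely. A real installer would unpack its payload now.
        mkdir -p "$prefix/bin" || return 1
        echo "installed to $prefix"
    else
        # Direct use by an end user: this is where the interactive
        # (or graphical) frontend would ask questions.
        echo "interactive mode: would prompt the user here"
    fi
}

# A distribution's pseudo-package would call it non-interactively:
vendor_install --prefix /tmp/vendorapp-demo --unattended
```

The point is not the specific flags but the contract: as long as every vendor installer offers *some* scriptable, prompt-free entry point, each distribution can keep its own package format and still ship the software.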
3) Yes, huge updates like the half-year updates Ubuntu does can *easily* break the system. In my own experience, rolling-release distributions tend to have fewer of those problems if the user updates regularly. The speaker asks whether those updates really have to include everything they do, and I say: yes, they have to include *most* of it. At least the kernel updates are needed because those bring new drivers to support new hardware as well as lots of security fixes. X.org updates, on the other hand, are *problematic*, since they are difficult to test on a wide base due to the many different configurations possible when it comes to graphics cards and driver features.
In general, many distributions should maybe take a more conservative route for end users and ship stable releases, while those interested in the bleeding edge (and willing to face occasional problems) can use an "unstable" version with more up-to-date packages in various areas. Most important is that distributions aimed at "end users" listen to the creators of the packages they ship. When those say "experimental, not meant for end users" (remember KDE 4.0?), the distributions should keep that in mind.
4) Lack of software is a difficult problem. He has not really provided a working and "good" solution either. This is basically a chicken-and-egg problem. E.g. Adobe does not see numbers suggesting users would buy Photoshop if it were available for Linux, so they fear they won't make a profit with it. Unless they see a (good) chance of a profit, they have no reason to port it; but how should users show that they would buy the software when there is not much available? I imagine Adobe also saw what happened to Ahead with Nero: after *many* years without any Linux support they shipped some "not too good" burning software (a Nero derivative, *not* on par feature-wise with the Windows version, and buggy!), and users did not buy it because a better free alternative (K3b) exists.
Though the question remains: can the users themselves really change something about this? I say they can only vote with their wallets: only buy stuff that provides native Linux versions, *and* tell the vendors that they would be interested in buying their stuff, but only if a native Linux version were available. And running the software via WINE or a virtual machine is not a good solution, since that way companies do not see the demand from Linux users; they just see "hey, it sells well on Windows".
In general I think this presentation did not really say much. It especially did not provide something "normal users" can actually do. He talked mainly about packaging, which is not as much of a problem as he made it appear, since vendors should *not* provide distribution-specific packages but only "generic" packages that the distributions can easily adopt. Yes, those packages/installers should probably use a more uniform format so that it is easier to integrate them into repositories, but that's basically it.