While NVIDIA's proprietary driver for their GeForce/Quadro hardware still lacks RandR 1.2+ support (that will hopefully change when RandR 1.4 is finally out), NVIDIA has proposed extending RandR to support overscan compensation. This support isn't for their mainline NVIDIA binary driver but rather their Tegra Linux driver...
This would be a great feature if it were supported across the board with xrandr. I have a computer hooked up to my TV as a media center, and XBMC can compensate for the overscan, but most applications can't, and desktop environments (GNOME, KDE) are very fond of screen edges. I thought this was something xrandr should handle, but I couldn't find any way to do it.
While yes, the other drivers have managed to implement underscan support, the catch is that radeon and intel have done it differently. The mechanism is the same, but there is no standardization in the naming of the output property you have to set.
Worse, in neither case is it apparent what the actual property names are, at least not from the xrandr documentation.
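Since the property names aren't documented in xrandr itself, the practical workaround is to ask the driver directly. A rough sketch: `xrandr --props` lists whatever output properties the loaded driver exposes, and `xrandr --set` writes one. The radeon property names below are taken from that driver's conventions, and the output name `HDMI-0` is an assumption; substitute whatever your own setup reports.

```shell
# List every output property the driver exposes; the underscan
# controls (if the driver has any) will show up here under
# their driver-specific names.
xrandr --props

# Example for the radeon driver (property names assumed from that
# driver; output name HDMI-0 is hypothetical -- use your own):
xrandr --output HDMI-0 --set underscan on
xrandr --output HDMI-0 --set "underscan hborder" 40
xrandr --output HDMI-0 --set "underscan vborder" 24
```

The point of the complaint stands: on intel (or any other driver) the equivalent property, if it exists at all, has a different name, so there is no one invocation that works everywhere.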
As much as I despise nvidia, Plattner has a point: it does, in some ways, make sense for xrandr to have a standardized way of setting underscan that all the drivers can implement.
That being said, underscan is obsolete and should be allowed to die. Most LCD TVs support native resolution or (at the very least) implement underscan themselves. For the odd TV that does NOT, strong words should be delivered to the manufacturer along the lines of "WTF you stupid douchebags??!?!?! Get with the program!!!".
I wouldn't worry too much about implementing support for ancient TVs.