I've been using the latest Git-sourced versions of radeon, mesa, and libdrm and they've largely been working great, but X fails to detect all video resolutions my monitor is capable of, including my preferred resolution (1280x960). This was irritating until I found the "IgnoreEDID" option in xorg.conf.
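For reference, the relevant bit of my xorg.conf looks roughly like this (the device identifier is just illustrative):

```
Section "Device"
    Identifier "Radeon"
    Driver     "radeon"
    Option     "IgnoreEDID" "true"
EndSection
```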
However, I just recompiled my kernel with the "Enable modesetting on radeon by default" option set. Works fine (well, sometimes) except I'm stuck with the same limited number of resolutions again. "IgnoreEDID" is itself completely ignored.
Any ideas what the explanation (and resolution, hopefully) is?
So it works, but there's one thing that has me worried. I use a CRT monitor. Although I used 'gtf' to generate an entry using the same horizontal resolution, vertical resolution, and refresh rate that are detected with IgnoreEDID (as well as by Windows), *something* is different, since I have to fiddle with the monitor controls again (size, position, pincushion, etc.). Do I have any reason to worry that this indicates the mode is set wrong and could therefore harm my monitor?
It's not likely to harm your monitor, but if you wanted to be sure you could probably look at the logs when running UMS and find out the exact timing entry (presumably one of the X server defaults) that you were using before, then tweak your new settings to match.
IIRC near the front of the log there's a dump of all the available modes with timing info (each mode has a name string plus a list of numbers), then later on there are a couple of cryptic messages telling you which of those modes was selected (via name string).
Windows may use cvt to generate the mode, so you might try a cvt-generated mode rather than a gtf one. As long as the mode's sync ranges and pixel clock are within the limits of the monitor, it should be fine. The adjustments you need to make on the monitor are due to slight differences in the mode timing, probably from gtf vs. cvt.
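To see how much two generators actually differ, you can compute the sync rates a modeline implies from its numbers; here's a minimal sketch (the modeline values below are made up for illustration — substitute the real output of `gtf` and `cvt`, or the lines from your X log):

```python
# Compare two X Modelines by the sync rates they imply -- a quick way to see
# why gtf- and cvt-generated modes need different monitor geometry tweaks.
# The modeline numbers below are illustrative, NOT real gtf/cvt output.

def parse_modeline(s):
    """Parse '"name" clock hdisp hsyncstart hsyncend htotal vdisp vss vse vtotal'."""
    parts = s.split()
    name = parts[0].strip('"')
    nums = [float(x) for x in parts[1:10]]
    clock = nums[0]                      # pixel clock, MHz
    htotal, vtotal = nums[4], nums[8]    # total pixels per line / lines per frame
    return name, clock, htotal, vtotal

def sync_rates(clock, htotal, vtotal):
    hfreq_khz = clock * 1000.0 / htotal           # horizontal sync rate, kHz
    vfreq_hz = clock * 1.0e6 / (htotal * vtotal)  # vertical refresh rate, Hz
    return hfreq_khz, vfreq_hz

gtf_mode = '"1280x960_85" 148.0 1280 1376 1512 1744 960 961 964 999'
cvt_mode = '"1280x960_85" 149.0 1280 1376 1512 1752 960 963 967 1000'

for line in (gtf_mode, cvt_mode):
    name, clock, ht, vt = parse_modeline(line)
    h, v = sync_rates(clock, ht, vt)
    print(f"{name}: clock {clock} MHz, hsync {h:.2f} kHz, refresh {v:.2f} Hz")
```

If the horizontal and vertical rates both land inside the ranges printed in your monitor's manual, the mode is safe even though the blanking intervals (and hence picture geometry) differ.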