GPU scaling, resolutions
I am running Catalyst 11.2, 8.78.3-100920a-105558C-ATI, with a Powercolor HD5850, on Ubuntu 10.10 (Linux 2.6.35-25-generic).
Hooked up to the top DVI port, through a DVI->VGA adaptor, is an HP A7217A (Sony Trinitron GDM-FW900 with HP branding).
fglrx correctly uses the monitor's optimal timings, and I have a 2304x1440 (120.5 kHz * 80 Hz) login and desktop. This is nice. However, when I start up a fullscreen game and tell it to run at e.g. 640x480, the monitor still receives a 120.5 kHz * 80 Hz signal - obviously wrong. The game is running at the small resolution, but the GPU is apparently scaling it up before outputting the picture.
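For reference, the relationship between pixel clock, horizontal scan rate, and refresh rate is simple arithmetic. A sketch using the standard VESA 640x480@60 timings (25.175 MHz clock, 800x525 total raster) - roughly what the monitor *should* be receiving here instead of the desktop's 120.5 kHz scan rate:

```python
# Scan-frequency arithmetic for a video mode:
#   horizontal frequency = pixel clock / horizontal total
#   vertical   frequency = horizontal frequency / vertical total

def scan_freqs(pixel_clock_hz, htotal, vtotal):
    """Return (horizontal freq in kHz, vertical freq in Hz) for a mode."""
    hfreq_hz = pixel_clock_hz / htotal
    vfreq_hz = hfreq_hz / vtotal
    return hfreq_hz / 1000.0, vfreq_hz

# Standard VESA 640x480@60: 25.175 MHz pixel clock, 800x525 total raster.
hkhz, vhz = scan_freqs(25_175_000, 800, 525)
print(f"{hkhz:.2f} kHz, {vhz:.2f} Hz")  # prints: 31.47 kHz, 59.94 Hz
```

So a real 640x480 mode would put the monitor around 31.5 kHz; seeing 120.5 kHz * 80 Hz regardless of the game's resolution is a clear sign the GPU is outputting the scaled desktop mode.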
I have looked all over amdcccle for an option to fix this. In fact, when I had my old Samsung SyncMaster 2494 hooked up via DVI, I remember there was an option for this. With the A7217A, though, the option is gone - which would be fine, except scaling seems to default to on and I cannot turn it off.
How can I forcibly disable GPU scaling and get proper low-frequency timings out of my Radeon?
On a related note, is there any way to expand the list of available resolutions? xrandr seems to choke on every new mode I try with fglrx. The open source drivers worked out of the box with a huge list from 320x200 up to 2304x1440, and I could even set up all kinds of crazy custom timings. No luck with the proprietary drivers. (And no, I can't switch back to the open source drivers - these days I can't live without OpenGL 4.)
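For comparison, this is the usual RandR 1.2 way of adding a custom mode - it works with the open source drivers, but fglrx rejects --addmode, which matches the "choking" described above. A minimal sketch that just builds the command lines from a VESA-style modeline (the timing numbers are the standard VESA 640x480@60 modeline; the output name "DVI-0" is an assumption, check `xrandr -q` for yours):

```python
# Build the xrandr commands for adding a custom mode (RandR 1.2+).
# For other modes you would normally take the timing numbers from
# `cvt <width> <height> <rate>`.

def xrandr_newmode_cmds(name, clock_mhz, h, v, flags, output="DVI-0"):
    """Return the (newmode, addmode) command strings for a modeline.

    h = (hdisp, hsync_start, hsync_end, htotal); v likewise for vertical.
    The default output name "DVI-0" is an assumption.
    """
    timings = " ".join(str(n) for n in (*h, *v))
    newmode = f'xrandr --newmode "{name}" {clock_mhz} {timings} {flags}'
    addmode = f'xrandr --addmode {output} "{name}"'
    return newmode, addmode

new, add = xrandr_newmode_cmds(
    "640x480_60", 25.175,
    (640, 656, 752, 800),   # hdisp, hsync start, hsync end, htotal
    (480, 490, 492, 525),   # vdisp, vsync start, vsync end, vtotal
    "-HSync -VSync",
)
print(new)
print(add)
```

After running the two printed commands, `xrandr --output DVI-0 --mode "640x480_60"` would switch to the new mode - again, only on drivers that actually implement RandR 1.2 mode addition.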
No ideas? It's painful not being able to use any resolutions besides 2304x1440 without artifacts.
Hi, I'm actually having the exact opposite problem: I have exactly the same monitor, and the automatic up-scaling suddenly stopped working. Now when I use a low resolution, instead of up-scaling, the monitor just displays it in the middle of a 2304x1440 screen with huge black borders. I actually liked the up-scaling, because it lets me play old low-res games without feeling like I'm looking at a pointillist painting!

Anyway, I'm pretty sure the up-scaling is a feature of the monitor itself, rather than of the GPU or drivers. I find that if I change the refresh rate after selecting a lower resolution, the monitor displays the actual resolution instead of scaling it. If you're playing a game that takes over the display, you might try a utility like PowerStrip to force a non-standard refresh rate, or at least something different from 80 Hz. Let me know if it works.