What's missing from the current EDID info (or any other input to xorg) is any concept of viewing distance. Even a heuristic to maintain a minimum point size (character size in pixels, roughly) probably wouldn't do what you want, since that would put a floor at minimum readability, and you probably want "nice" readability.
So yeah, xorg would be the group to talk to, although "complain" is a bit strong: the server is doing the calculations correctly based on the info provided by the monitor (which is itself reasonably correct). What you'd really be asking for is some kind of override to say "pretend my projector didn't just tell you about the big screen".
Leaving a DisplaySize line in your Monitor section seems like a decent solution for now, so you might want to word your question as "if xorg.conf is going away, how will this be handled in the future, huh? huh?".
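For illustration, a sketch of what that Monitor section could look like (the `Identifier` name and the 1920x1080/96-DPI numbers are my assumptions, not anything from your setup; DisplaySize takes the physical size in millimetres):

```
Section "Monitor"
    Identifier   "Projector"
    # Override the EDID-reported physical size (width height, in mm).
    # 508 x 286 mm makes a 1920x1080 mode come out at roughly 96 DPI.
    DisplaySize  508 286
EndSection
```

The server derives DPI from the mode's pixel dimensions divided by this physical size, so lying about the size here is exactly the "pretend it's not a big screen" override.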
Yeah, but if you wanted Windows behaviour everywhere, you would probably be using Windows.
AFAIK Windows does not consider DPI when converting character sizes from points to pixels. This avoids the problem you are experiencing, but brings a number of *different* problems (e.g. fonts becoming unreadable on a laptop with a small, high-resolution LCD).
Be aware that this issue is still being hotly debated on xorg lists, and that there are a number of conflicting views on what "correct" behaviour really is.
It seems to me that none of the current approaches deliver what a user really expects when they get a new, high-res monitor, which IMO is a combination of "seeing more" and "clearer text". This implies that neither ignoring DPI (Windows) nor calculating based on DPI (X) is correct, and that the ideal behaviour would be some kind of pro-rated calculation where doubling the DPI might result in fonts which are maybe 1.5 times as big when measured in pixels -- smaller than before but still readable.
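To make the comparison concrete, here's a rough sketch of the three policies as point-to-pixel conversions. The `BASE_DPI` of 96 and the `0.585` exponent are pure assumptions on my part: 96 is just the usual "assumed" desktop DPI, and 0.585 is picked only so that doubling the DPI makes text about 1.5x taller in pixels, as suggested above.

```python
# Sketch only -- not any real toolkit's implementation.
BASE_DPI = 96.0  # assumed "normal" DPI; an assumption, not a standard

def px_ignore_dpi(points, dpi):
    """Windows-style: pretend every screen is BASE_DPI."""
    return points * BASE_DPI / 72.0

def px_true_dpi(points, dpi):
    """X-style: honour the DPI the monitor reports."""
    return points * dpi / 72.0

def px_prorated(points, dpi, exponent=0.585):
    """Hypothetical compromise: scale with DPI, but less than linearly.
    exponent=0.585 makes 2x DPI give roughly 1.5x the pixels."""
    effective_dpi = BASE_DPI * (dpi / BASE_DPI) ** exponent
    return points * effective_dpi / 72.0

for dpi in (96, 192):
    print(dpi,
          round(px_ignore_dpi(10, dpi), 1),
          round(px_true_dpi(10, dpi), 1),
          round(px_prorated(10, dpi), 1))
```

At 192 DPI a 10pt font stays ~13.3px under the Windows policy, doubles to ~26.7px under the X policy, and lands at ~20px under the pro-rated one, which is the "smaller than before but still readable" middle ground.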