Can you make some screenshots or photos to show how things look without HiDPI support? I have no idea what it looks like - never seen it.
Without HiDPI support, it looks like any regular screen - that's the problem: it treats a 4K screen as simply a very large monitor. But it's not actually a big monitor, it's a normal-sized one with a much higher resolution, so if you don't do anything special, something 100 pixels in size ends up unexpectedly small.
HiDPI support effectively adds scaling factors, so that if the DPI is over a threshold, the title bar (for example) will be 80 pixels high, instead of 40. It's integer-multiples only, though, since that makes it easier to deal with non-scalable content like bitmap images - you can have 1x or 2x multipliers, but not 1.5x...
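As a minimal sketch of the integer-only approach described above (the threshold value and bucket boundaries here are illustrative assumptions, not any particular OS's actual numbers):

```python
# Hypothetical sketch: pick an integer HiDPI scale multiplier from a
# display's DPI. Fractional factors like 1.5x are deliberately not
# produced, since bitmap content only scales cleanly by whole numbers.

def scale_factor(dpi: float, threshold: float = 144.0) -> int:
    """Return an integer UI scale multiplier for a given display DPI."""
    if dpi >= 2 * threshold:
        return 3  # very dense displays
    if dpi >= threshold:
        return 2  # typical "retina"-class displays
    return 1      # conventional ~96 dpi displays

# A 40-pixel title bar doubles to 80 pixels on a 2x display:
titlebar_px = 40
print(titlebar_px * scale_factor(192))  # -> 80
```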
Looks like some sizes should be expressed in physical distances (actual inches in the screen, not pixels) or relative to screen size... just like CSS works.
You're, oh, about a decade late to this party.
That always sounds like the right fix, but it kinda isn't. There are several problems, which mainly boil down to:
1) Lots of displays lie to us.
That is, we don't accurately know how large quite a lot of physical screens are. Screens are supposed to provide this information in their EDID data, but some do not, or provide incorrect data. Sometimes they're a little bit out, sometimes they're a lot out. It's not practically possible to come up with smart enough heuristics or a big enough quirk database to cope with this.
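To make the consequence of bad EDID data concrete, here's a rough sketch of the arithmetic involved - DPI is derived from the pixel dimensions and the physical size the display itself reports, so a lying display poisons everything downstream. (This is not a real EDID parser; the function and its parameters are illustrative.)

```python
# Sketch: derive diagonal DPI from a display's reported pixel and
# physical (millimetre) dimensions, as an OS would from EDID data.
from math import hypot

def dpi_from_edid(width_px: int, height_px: int,
                  width_mm: int, height_mm: int) -> float:
    """Diagonal DPI from reported pixel and millimetre dimensions."""
    diag_px = hypot(width_px, height_px)
    diag_in = hypot(width_mm, height_mm) / 25.4  # mm -> inches
    return diag_px / diag_in

# Correct data for a 27" 4K monitor gives a plausible ~163 dpi:
print(round(dpi_from_edid(3840, 2160, 597, 336)))
# A display that lies about its size (here reporting 16 cm x 9 cm)
# yields a wildly wrong figure, and the UI is scaled accordingly:
print(round(dpi_from_edid(3840, 2160, 160, 90)))
```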
2) People don't sit the same distance away from all screens.
This is one a lot of people miss, but 'how big something looks to the person looking at a screen' is a *three* factor equation, not a *two* factor equation:
i) Physical screen size
ii) Screen resolution
iii) Distance of viewer from screen
To give the most obvious example: it might sound reasonable that we should always try to render, say, 11pt text at the same physical size...but what if you're sitting on a couch looking at it on your TV across the room?
That's an extreme case, but there's one which is so common just about everyone does it: you don't sit the same distance away from your tablet, your laptop and your desktop monitor. If you've ever actually crunched the numbers and set a 'correct' DPI or scaling factor on your desktop and laptop, you might notice that everything looks too big on the laptop. This is because you usually sit rather closer to a laptop display than to a desktop display. You usually sit closer again to a tablet display. So you *don't* actually want things to be the same physical size on all three.
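The three-factor relationship above can be sketched as a visual-angle calculation - how large something *looks* depends on its size in pixels, the display's pixel density, and the viewing distance. The specific DPI and distance values below are illustrative assumptions:

```python
# Sketch: the visual angle subtended by an on-screen element depends on
# pixel size, display DPI, AND viewing distance - three factors, not two.
from math import atan, degrees

def apparent_size_deg(size_px: float, dpi: float,
                      distance_in: float) -> float:
    """Visual angle (degrees) subtended by size_px pixels."""
    physical_in = size_px / dpi
    return degrees(2 * atan(physical_in / (2 * distance_in)))

# The same 16-pixel element looks roughly as big on a ~96 dpi desktop
# monitor at ~28" as on a ~130 dpi laptop display viewed from ~20" -
# which is why "same physical size everywhere" is the wrong goal:
print(round(apparent_size_deg(16, 96, 28), 2))   # desktop
print(round(apparent_size_deg(16, 130, 20), 2))  # laptop
```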
If you look in the real world, up until hidpi screens started coming online, there was actually some method to the apparent madness of display resolutions. Almost any desktop display you can buy is in the range of 90-110dpi. Many laptop displays are in a range of around 120-150dpi; if you compensate for physical distance, those two ranges are actually pretty similar in terms of 'perceived size'. It may seem nutty, but it makes sense.
The 'just quadruple everything' approach to hidpi might seem like a bit of a hack, but it actually makes quite a bit of sense. For an OS, trying to get things 'just right' for arbitrary devices with arbitrary screen size/resolution combinations is a problem so hard it's basically unsolvable. If we make it so that most devices fit in one of two or three resolution 'buckets' - the 90-110dpi-equivalent bucket, the 180-220dpi-equivalent bucket, the 270-330dpi-equivalent bucket (approximately) - and the OS just has to figure out which of the three buckets the device in question falls into, it makes things a lot more manageable. Yet still difficult. Just think: if we have this much trouble getting things right for most/all displays with just three possible settings, how impossible it would be to do it 'properly'.
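A rough sketch of that bucketing idea, folding in the viewing-distance compensation from earlier - the reference distance, per-device distances, and bucket boundaries are all illustrative assumptions:

```python
# Sketch: classify a display's distance-compensated ("equivalent") DPI
# into one of three buckets, each mapping to an integer scale factor.

REFERENCE_DISTANCE_CM = 60.0  # assumed typical desktop viewing distance

def equivalent_dpi(dpi: float, viewing_distance_cm: float) -> float:
    # A display viewed from closer than the reference distance looks
    # "denser", so normalise DPI by distance relative to the reference.
    return dpi * viewing_distance_cm / REFERENCE_DISTANCE_CM

def bucket(eq_dpi: float) -> int:
    # Three buckets: roughly 90-110, 180-220, 270-330 dpi-equivalent.
    if eq_dpi < 150:
        return 1
    if eq_dpi < 250:
        return 2
    return 3

# A ~140 dpi laptop viewed from ~45 cm lands in the same 1x bucket as
# a ~100 dpi desktop monitor, matching the "perceived size" argument:
print(bucket(equivalent_dpi(140, 45)))  # -> 1
print(bucket(equivalent_dpi(100, 60)))  # -> 1
```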
OK, there's enough info for me to clearly understand the issue. Of course, I just wish display manufacturers would include the (correct) resolution and DPI value in their EDID. That would make it easy for software developers to offer decent UIs out of the box.