GLAMOR & xf86-video-modesetting Get Deep Color Support In X.Org Server 1.20

  • GLAMOR & xf86-video-modesetting Get Deep Color Support In X.Org Server 1.20

    Phoronix: GLAMOR & xf86-video-modesetting Get Deep Color Support In X.Org Server 1.20

    Independent developer Mario Kleiner has spent the past several months working on plumbing the Linux graphics stack for better "deep color" or 30-bit color depth support. His latest work on the X.Org Server has now been merged to mainline...
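
    For anyone who wants to try it once a new enough server lands, a minimal sketch of an xorg.conf screen section for the modesetting driver (assuming a GPU, cable and monitor chain that actually handles 10 bits per channel; the identifier is just a placeholder) would look something like this, and whether it took effect can then be checked with something like xdpyinfo | grep -i depth:

        Section "Screen"
            Identifier   "Screen0"
            DefaultDepth 30    # 10 bits per colour channel instead of the usual 24
        EndSection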


  • #2
    10 bits certainly ought to be enough for anybody.

    I wonder if there are plans to make the architecture modular enough to accommodate even wider gamuts and bit depths when they appear. Or is it not worth the effort (the new standard might be incompatible with the prep work, the software might be discontinued in the meantime, etc.)?

    • #3
      Originally posted by M@yeulC View Post
      10 bits certainly ought to be enough for anybody.
      Actually, that is kind of true. When the research behind the now-popular electro-optical transfer functions for HDR was done, experiments were run on how many distinct brightness values humans can distinguish in a picture, and 10 bits turned out to be pretty much "enough for everyone" when spaced cleverly (not equidistantly) between 0 and 10000 cd/m^2 of luminance.
      Sure, one can always ask for more, if only to leave some headroom for future displays that might reach even greater extremes, but 10 bits sounds like a very reasonable sampling (for displays - for cameras or intermediate formats you of course want more, to allow for processing).
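
      For the curious: the clever, non-equidistant spacing referred to here is presumably the SMPTE ST 2084 "PQ" curve. A rough C sketch of how it maps an absolute luminance to a 10-bit code value (the constants are the ones published in the standard) could look like this - the point being that most of the 1024 codes end up in the dark range, where the eye is most sensitive to banding:

          #include <math.h>
          #include <stdio.h>

          /* SMPTE ST 2084 (PQ) inverse EOTF:
           * absolute luminance in cd/m^2 -> normalised signal in [0,1] */
          static double pq_encode(double cd_m2)
          {
              const double m1 = 2610.0 / 16384.0;        /* 0.1593017578125 */
              const double m2 = 2523.0 / 4096.0 * 128.0; /* 78.84375 */
              const double c1 = 3424.0 / 4096.0;         /* 0.8359375 */
              const double c2 = 2413.0 / 4096.0 * 32.0;  /* 18.8515625 */
              const double c3 = 2392.0 / 4096.0 * 32.0;  /* 18.6875 */
              double y = pow(cd_m2 / 10000.0, m1);       /* normalise to the 10000 cd/m^2 peak */
              return pow((c1 + c2 * y) / (1.0 + c3 * y), m2);
          }

          int main(void)
          {
              /* Show how the 10-bit code range is distributed over luminance */
              const double samples[] = { 0.1, 1.0, 10.0, 100.0, 1000.0, 10000.0 };
              for (unsigned i = 0; i < sizeof samples / sizeof samples[0]; i++)
                  printf("%7.1f cd/m^2 -> code %4.0f of 1023\n",
                         samples[i], round(pq_encode(samples[i]) * 1023.0));
              return 0;
          }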

      • #4
        I don't understand why, when the human eye is more sensitive to green light, they don't split it 10 Red - 12 Green - 10 Blue, the same way they split 16-bit depths up as 5 Red - 6 Green - 5 Blue. What is the point of discarding 2 bits? They can't use that few for alpha...
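
        For reference, the two "discarded" bits are just padding that keeps each pixel on a 32-bit boundary, and some formats do reuse them as a 2-bit alpha (e.g. ARGB2101010). A small sketch of how the classic 5-6-5 and a DRM_FORMAT_XRGB2101010-style layout are packed, assuming the usual little-endian field order:

            #include <stdint.h>

            /* 16-bit 5-6-5: the odd extra bit goes to green, nothing is left over */
            static inline uint16_t pack_rgb565(uint16_t r5, uint16_t g6, uint16_t b5)
            {
                return (uint16_t)((r5 << 11) | (g6 << 5) | b5);
            }

            /* 32-bit x:R:G:B 2:10:10:10: three 10-bit channels plus two padding
             * bits in 31:30 that keep the pixel word-aligned */
            static inline uint32_t pack_xrgb2101010(uint32_t r10, uint32_t g10, uint32_t b10)
            {
                return (r10 << 20) | (g10 << 10) | b10; /* bits 31:30 stay zero */
            }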

        • #5
          Originally posted by linuxgeex View Post
          I don't understand why, when the human eye is more sensitive to green light, they don't split it 10 Red - 12 Green - 10 Blue, the same way they split 16-bit depths up as 5 Red - 6 Green - 5 Blue. What is the point of discarding 2 bits? They can't use that few for alpha...
          With regard to bandwidth requirements on HDMI or DP connections, every additional bit adds some - it's not as if HDMI were wasting 2 out of 32 bits on the wire.

          And memory size/bandwidth for frame buffers on a GPU is probably not as much of a concern anymore as it was back when 16-bit visuals were popular.
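
          To put rough numbers on the link-bandwidth point, counting only the active pixels of a hypothetical 3840x2160 mode at 60 Hz and ignoring blanking and line-coding overhead:

              #include <stdio.h>

              int main(void)
              {
                  /* Raw active-pixel data rate for 3840x2160 at 60 Hz,
                   * at 24, 30 and 36 bits per pixel */
                  const double pixels_per_second = 3840.0 * 2160.0 * 60.0;
                  for (int bpp = 24; bpp <= 36; bpp += 6)
                      printf("%2d bpp -> %5.1f Gbit/s\n",
                             bpp, pixels_per_second * bpp / 1e9);
                  return 0;
              }

          That already puts 10-bit RGB at 4K60 in the territory where older HDMI links have to fall back to chroma subsampling, which is roughly the kind of pressure being described here.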

          • #6
            Many standards already support 48-bit color. 30 bits isn't enough for the human eye. 140 dB DACs for audio and 48 bits for the eye = better.

            • #7
              Originally posted by dwagner View Post
              With regard to bandwidth requirements on HDMI or DP connections, every additional bit adds some - it's not as if HDMI were wasting 2 out of 32 bits on the wire.

              And memory size/bandwidth for frame buffers on a GPU is probably not as much of a concern anymore as it was back when 16-bit visuals were popular.
              Thanks, yeah, I forgot that with display resolutions ramping up, the standards bodies are actually being conservative with bandwidth allocations, and that would lead to adopting a less-than-optimal use of resources.

              As for GPUs though, the exact opposite is true. The reason for the GPU shortage isn't crypto-mining; it's a shortage of RAM, so wasting RAM is the last thing GPU manufacturers want to do.

              • #8
                Scientists always underestimate the capability of human perception - refresh rates, pixel density, and so on. Even today you have scientists claiming 20 fps is the most people can see. It's not a joke: Adrien Chopin, for example, said “It’s clear from the literature that you cannot see anything more than 20 Hz”.

                Meanwhile, others find that even at 500 fps artefacts are visible.

                The difference between 8-bit and 10-bit is incredible, and since I made the switch it's been painful to go back to 8-bit on the same screen. So yes, GLAMOR supporting depth 30 with Xorg >= 1.19.99.1 is great news, and about time.

                I personally can't wait for 12-bit and 14-bit to become mainstream. Even those who can't tell the difference would, I bet, say they prefer the higher-bit image every time, even if they couldn't say why.

                The comfort that comes from higher sensory input quality is too often ignored in our culture. We often put up with whatever level is just low enough not to cause complete revolt, like movies at 24 fps or gaming at 30 fps. Some of these are due to technological limitations, but most - typesetting, say - come down to tolerating mediocrity in our daily lives.
