
Thread: Intel Implements CMS MSAA For Ivy Bridge Driver

  1. #1
    Join Date
    Jan 2007
    Posts
    14,788

    Default Intel Implements CMS MSAA For Ivy Bridge Driver

    Phoronix: Intel Implements CMS MSAA For Ivy Bridge Driver

    The latest noteworthy patch-set coming out of the Intel Open-Source Technology Center is Mesa support for CMS MSAA for Ivy Bridge hardware...

    http://www.phoronix.com/vr.php?view=MTEzNTA
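For anyone unfamiliar with what MSAA actually computes: each pixel stores N coverage samples, and a "resolve" pass averages them into the final single-sample image. Intel's CMS work is about how those multisampled surfaces are compressed in hardware; the sketch below only illustrates the basic resolve idea, not the CMS layout itself.

```python
def resolve_msaa(samples_per_pixel):
    """Average N color samples per pixel down to one color.

    samples_per_pixel: list of pixels, each a list of (r, g, b) samples.
    Returns one resolved (r, g, b) tuple per pixel.
    """
    resolved = []
    for samples in samples_per_pixel:
        n = len(samples)
        r = sum(s[0] for s in samples) / n
        g = sum(s[1] for s in samples) / n
        b = sum(s[2] for s in samples) / n
        resolved.append((r, g, b))
    return resolved

# An edge pixel half-covered by a white triangle over a black background,
# with 4x MSAA: two samples hit, two miss, resolving to a 50% grey pixel.
edge_pixel = [(1.0, 1.0, 1.0), (1.0, 1.0, 1.0),
              (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
print(resolve_msaa([edge_pixel]))  # [(0.5, 0.5, 0.5)]
```

That averaged grey is what smooths the jagged triangle edge; 8x MSAA just uses eight samples per pixel instead of four.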

  2. #2
    Join Date
    Jan 2009
    Posts
    464

    Default

    I'm happy to see these AA patches going into linux.
    I'm saddened to see it happening so close to the release of high-ppi displays.
    I'm happy we're going to see more high-ppi displays though.

    So overall.... YAY!!

    F

  3. #3
    Join Date
    Sep 2008
    Posts
    989

    Default

    Quote Originally Posted by russofris View Post
    I'm happy to see these AA patches going into linux.
    I'm saddened to see it happening so close to the release of high-ppi displays.
    I'm happy we're going to see more high-ppi displays though.

    So overall.... YAY!!

    F
"High-DPI" displays will just make UIs tiny, unless they scale them back up to being larger (more px), which defeats the purpose. You can only squeeze so much stuff into a 15.6" space (or however much space you have). Of course, if you've got an 80" display, then a large resolution makes sense. But laptops aren't benefiting from this, because people will just want their UI chrome and text scaled up.

  4. #4
    Join Date
    Jul 2010
    Posts
    593

    Default

    Quote Originally Posted by allquixotic View Post
"High-DPI" displays will just make UIs tiny, unless they scale them back up to being larger (more px), which defeats the purpose
How exactly? You get sharper fonts, images look better, round objects look rounder, and so on. Although you probably don't play anything on Intel hardware at 2K resolution, games would still look better due to the extra detail.

  5. #5
    Join Date
    Sep 2008
    Posts
    989

    Default

    Quote Originally Posted by Teho View Post
How exactly? You get sharper fonts, images look better, round objects look rounder, and so on. Although you probably don't play anything on Intel hardware at 2K resolution, games would still look better due to the extra detail.
    Uh, because if you are scaling up the UI then the scaling process eliminates the detail or incorporates it into the surrounding pixels?

    If the source media is designed with detail in that range and just occupies more pixels, that's one thing -- but since existing graphics are designed for existing resolutions, Gnome isn't going to magically look better because you're using a 16000x12000 resolution.

    Example: say you have a 17" display that is currently 1600x1200 (using 4:3 because it's easier to calculate). Now increase the resolution to 3200x2400 but keep the screen size fixed at 17".

    If you had a window border occupying 24 px and text in that window border occupying 10 px, now that will look twice as small, so you will need a magnifying glass (or suffer from eye strain) to see the smaller features.

    The other option is to create, from scratch, the necessary graphics with greater details so that the window border occupies 48 px and the text occupies 20 px, in order to remain the same size. But information theory dictates that if you use the same fonts you used before, then there is no more detail (or "information") in 20 px worth of the font scaled up than there was in 10 px of the font. So you'd have to create a new higher-res font too, as well as any graphical features that comprise the UI.

    Since 99% of what we see on the screen is "existing content" (i.e. most web sites, fonts, UI elements, etc) we aren't going to benefit from this until many years have gone by where standard resolutions are much higher than they are today.

    It'll be a long time.

    In fact this has happened before. Imagine being one of the first people with a 1280x1024 display back in the early to mid 90s when standard resolutions were 640x480 and 800x600. You would be viewing WWW content designed for smaller screens which would look really weird (small, etc) and you would be playing DOS games that are also designed for small resolutions and all the bitmaps will either look tiny and inscrutable, or large and pixellated to you. Take your pick between two unhappy choices.

    The best possible graphics quality occurs when your content is designed (from scratch) for the exact same resolution that your screen is. That's why so many games (well, those that don't use vectorized graphics, which is >95% IMHO) will ship separate texture packs for each screen resolution, or at least have "low", "medium" and "high" detail textures. At low res screens, the low textures look great. At high res screens, the low res textures look like crap. And at low res screen with high textures, it looks almost the same as low textures.
    Last edited by allquixotic; 07-07-2012 at 06:43 PM.
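The arithmetic in the post above is easy to check. Taking its own numbers (a 17" 4:3 panel going from 1600x1200 to 3200x2400), here is the physical shrinkage of that 10 px of text when pixel density doubles:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from the panel resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

low = ppi(1600, 1200, 17)    # ~117.6 PPI on the original panel
high = ppi(3200, 2400, 17)   # ~235.3 PPI, exactly double

def physical_mm(px, pixels_per_inch):
    """Physical height in millimetres of a feature that is `px` pixels tall."""
    return px / pixels_per_inch * 25.4

# The same 10 px of text shrinks from about 2.16 mm to about 1.08 mm tall:
print(round(physical_mm(10, low), 2), round(physical_mm(10, high), 2))
```

So unscaled, every feature really does land at half its former physical size, which is the eye-strain scenario the post describes.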

  6. #6
    Join Date
    Jun 2012
    Posts
    17

    Default A Possible Solution

    Quote Originally Posted by allquixotic View Post
    The other option is to create, from scratch, the necessary graphics with greater details so that the window border occupies 48 px and the text occupies 20 px, in order to remain the same size. But information theory dictates that if you use the same fonts you used before, then there is no more detail (or "information") in 20 px worth of the font scaled up than there was in 10 px of the font. So you'd have to create a new higher-res font too, as well as any graphical features that comprise the UI.
Although a good deal of UIs are designed that way on certain Linux distros, I believe some UIs, like Aero (on Windows) and Aqua (on Mac OS X), currently apply some vectorization to their fonts and window borders, which I believe should solve such a problem...? I think this would be highly feasible under GNOME in the future.

  7. #7
    Join Date
    Jul 2010
    Posts
    593

    Default

That doesn't sound right to me. To my knowledge, nothing in modern user interfaces is prerendered at a fixed resolution. Icons (SVGs), fonts, and user interfaces are scalable, although there are some problems here and there. So if you have, let's say, a 2K monitor, those icons and fonts are rendered at that resolution. So although you use "bigger" fonts and icons on higher-resolution monitors so that they are readable, they still have more detail (curves come to mind). Then again, I really don't know much about the subject.
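This point can be made concrete with a toy rasterizer: render the same vector shape (a circle) onto a 1x grid and a 2x grid and count the pixels tracing its edge. The doubled grid traces the curve with roughly twice as many pixels, i.e. genuinely more detail, whereas an upscaled bitmap would just repeat each old pixel.

```python
def edge_pixels(radius_px):
    """Count grid pixels whose center lies within half a pixel of the
    circle x^2 + y^2 = radius_px^2 (a crude one-pixel-wide outline)."""
    count = 0
    r = radius_px
    for x in range(-r - 1, r + 2):
        for y in range(-r - 1, r + 2):
            d = (x * x + y * y) ** 0.5
            if abs(d - r) < 0.5:
                count += 1
    return count

coarse = edge_pixels(10)   # the circle rendered on the 1x grid
fine = edge_pixels(20)     # the same physical circle on a 2x-density grid
print(coarse, fine)        # the fine outline uses roughly twice the pixels
```

Since the outline is re-rasterized from the mathematical shape, every density increase buys a finer trace; no scaling filter is involved.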

  8. #8
    Join Date
    Jun 2011
    Posts
    316

    Default

Linux needs really high-resolution displays. English text might look OK, but the Unicode fonts bundled with Linux distros, especially the Chinese ones, still look absolutely terrible. I've talked to somebody who works on open source / Creative Commons fonts for Linux, and every time I bring up the fact that Unicode fonts scale horribly because of the lack of proper font hinting in Linux Unicode fonts, they start grumbling about
"Apple / Microsoft font hinting patents",
"intellectual property",
"a lot of time and effort that nobody wants to do",
"when we all have very high resolution displays, proper font hinting/scaling won't matter anymore",
etc., etc.
...as excuses for why the Linux Unicode fonts look so bad. To this day, I still install Apple/Microsoft fonts to get some sexy Korean/Chinese/Japanese font scaling on low-resolution displays. The Linux Unicode fonts just end up a blurry mess from anti-aliasing or a jaggy mess from pixelation, making the characters hard to read.

I found a few Chinese Linux distros that had absolutely amazing Chinese fonts... Sure enough, they had stolen the fonts from an installation of Microsoft Windows and were cramming them into their Linux distro and giving it away for free. That is pretty much standard practice for a Linux distro in China.

If you use the Linux Unicode / Chinese fonts bundled with a typical distro, you *NEED* a 1080p screen on a 15.6" display, or your text is going to look like ass.

  9. #9

    Default

    Quote Originally Posted by allquixotic View Post
    Uh, because if you are scaling up the UI then the scaling process eliminates the detail or incorporates it into the surrounding pixels?
As people have pointed out, a lot of UI stuff is SVG or some other form of vector graphics. Text is not bitmapped either; it's drawn according to control points and hints, and as such will scale happily to higher resolutions and look crisper.

    If your UI *is* using bitmapped elements, that's *bad*.
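Those "control points" are worth unpacking: TrueType glyph outlines, for instance, are stored as control points for quadratic Bezier curves, and the rasterizer scales the points and re-evaluates the curve for whatever PPI it is targeting. A minimal sketch with a hypothetical one-segment glyph outline:

```python
def quad_bezier(p0, p1, p2, t):
    """Point on a quadratic Bezier curve at parameter t in [0, 1]."""
    x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
    y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
    return (x, y)

def render_outline(control_points, scale, steps=8):
    """Scale the control points (e.g. for a higher-PPI target) and sample
    the curve; a real rasterizer would then fill between these points."""
    p0, p1, p2 = [(x * scale, y * scale) for x, y in control_points]
    return [quad_bezier(p0, p1, p2, i / steps) for i in range(steps + 1)]

# A made-up glyph segment, drawn at 1x and at 2x: every sampled point lands
# exactly twice as far out, so no detail is lost or interpolated away.
segment = [(0, 0), (5, 10), (10, 0)]
at_1x = render_outline(segment, 1)
at_2x = render_outline(segment, 2)
print(at_1x[4], at_2x[4])  # curve midpoints: (5.0, 5.0) and (10.0, 10.0)
```

Because the curve is re-evaluated from the outline at each target size, a higher-resolution screen gets a genuinely finer rendering of the same glyph, which is exactly why bitmapped UI elements are the odd ones out.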

  10. #10
    Join Date
    Jan 2009
    Posts
    1,401

    Default

    Quote Originally Posted by allquixotic View Post
    Uh, because if you are scaling up the UI then the scaling process eliminates the detail or incorporates it into the surrounding pixels?

    If the source media is designed with detail in that range and just occupies more pixels, that's one thing -- but since existing graphics are designed for existing resolutions, Gnome isn't going to magically look better because you're using a 16000x12000 resolution.

    Example: say you have a 17" display that is currently 1600x1200 (using 4:3 because it's easier to calculate). Now increase the resolution to 3200x2400 but keep the screen size fixed at 17".

    If you had a window border occupying 24 px and text in that window border occupying 10 px, now that will look twice as small, so you will need a magnifying glass (or suffer from eye strain) to see the smaller features.

    The other option is to create, from scratch, the necessary graphics with greater details so that the window border occupies 48 px and the text occupies 20 px, in order to remain the same size. But information theory dictates that if you use the same fonts you used before, then there is no more detail (or "information") in 20 px worth of the font scaled up than there was in 10 px of the font. So you'd have to create a new higher-res font too, as well as any graphical features that comprise the UI.

    Since 99% of what we see on the screen is "existing content" (i.e. most web sites, fonts, UI elements, etc) we aren't going to benefit from this until many years have gone by where standard resolutions are much higher than they are today.

    It'll be a long time.

In fact this has happened before. Imagine being one of the first people with a 1280x1024 display back in the early to mid 90s when standard resolutions were 640x480 and 800x600. You would be viewing WWW content designed for smaller screens which would look really weird (small, etc) and you would be playing DOS games that are also designed for small resolutions and all the bitmaps will either look tiny and inscrutable, or large and pixellated to you. Take your pick between two unhappy choices.

    The best possible graphics quality occurs when your content is designed (from scratch) for the exact same resolution that your screen is. That's why so many games (well, those that don't use vectorized graphics, which is >95% IMHO) will ship separate texture packs for each screen resolution, or at least have "low", "medium" and "high" detail textures. At low res screens, the low textures look great. At high res screens, the low res textures look like crap. And at low res screen with high textures, it looks almost the same as low textures.
    Vector graphics don't have the problem you're speaking of.
