
Thread: Intel's UXA Will Not Be Merged Back Into EXA

  1. #1

    Phoronix: Intel's UXA Will Not Be Merged Back Into EXA

    Back in August we talked about the UXA acceleration architecture, which was developed by Intel and based upon EXA, but with pixmaps managed as GEM objects. A month later at the X Developers' Summit, Keith Packard clarified his UXA work. At that time he said the lessons learned from UXA would work their way back into EXA by separating the pixmap management and acceleration components of EXA...

    http://www.phoronix.com/vr.php?view=NzA0NQ

  2. #2

    I don't think this is so bad. It's a bit like the situation with radeon/radeonhd: the advantage of radeonhd is that it supports less hardware, so it is smaller and more compact. And if it's a new acceleration technology, why not, even if it's in fact only an extended EXA, and even (or precisely because) it's only useful for IGPs.

  3. #3

    I'd be curious as to the architectural reasons why, but I suppose we have to wait for the videos/minutes.

  4. #4

    Quote:
    I'd be curious as to the architectural reasons why, but I suppose we have to wait for the videos/minutes.
    Probably because there isn't any point in merging it back into the EXA drivers.

    Just like it was pointless to try to take the existing legacy EXA DDX code and extend it into a functional equivalent of UXA, it's pointless to try to backport UXA into the legacy EXA DDX.

    If UXA works, then why should you give a darn about sucking it into EXA? Just use UXA. Done. Get rid of EXA completely, throw it down the bit-bin of history... The EXA API was, by design, meant to be driven by 3D graphics hardware and to have a relatively short life as a 2D-driven API. I think most X.org people understood that dedicated 2D hardware in video cards is a dying thing... even in modern cards that still have 2D hardware, the 2D cores are essentially unchanged from when they were originally designed back in the GeForce 256 days (circa 1999).

    As far as programs go, it shouldn't matter. Both implement the EXA API, so programs and end users shouldn't give a darn one way or the other. The only way it would matter to end users is if they noticed a difference in performance and/or reliability.

  5. #5

    They never intended to help out other drivers by improving EXA incrementally. I could have told you this the moment Intel started putting
    Code:
    #if I830_USE_UXA
    into their driver. This showed they considered EXA and UXA distinct frameworks.

  6. #6

    Quote Originally Posted by drag View Post
    ...If UXA works, then why should you give a darn about sucking it into EXA? Just use UXA. Done...
    In and of itself, I don't. I don't advocate endless support of old hardware (especially since power/performance/cost ratios might suggest some of it is cheaper to replace), but there is plenty of capable hardware out there for general applications, and as I see these interesting new features being implemented I wonder how that older hardware could benefit. I.e., would improved 2D performance on an older system mean my computer stays relevant a little longer, or that CPU time can be put to more useful purposes? When new APIs are implemented, it tends to leave older drivers behind...

    I don't want to hinder progress, new features, or any of Intel's great work. I'm not pretending to have a full understanding of the implications; I was just posing the question because I was curious and trying to see the bigger picture.

  7. #7

    Quote Originally Posted by drag View Post
    If UXA works, then why should you give a darn about sucking it into EXA? Just use UXA. Done. Get rid of EXA completely, throw it down the bit-bin of history...
    I think the point is that UXA only works for integrated graphics processors where video memory = system memory. You would still need EXA for GPUs with dedicated video memory.

  8. #8

    Quote Originally Posted by bugmenot View Post
    Even if it's in fact only an extended EXA.
    Actually, it's a stripped down EXA. The sad thing is that the main benefit of UXA (storing pixmaps in buffer objects) could already be done with EXA before UXA existed, and the parts they stripped from EXA would be inactive anyway in that case. There are academic problems with the EXA driver interfaces for this, but I think it would have been better to fix those than to create yet another acceleration architecture. Why they chose differently is anyone's guess.
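    The point above — that a driver could already back pixmaps with buffer objects under plain EXA by stashing the buffer object in the pixmap's driver-private data — can be sketched in a toy C program. All type and function names here are hypothetical stand-ins for illustration, not the real exa.h or GEM interfaces:

    ```c
    #include <stdio.h>
    #include <stdlib.h>

    typedef struct {            /* stand-in for a kernel GEM buffer object */
        unsigned handle;
        size_t   size;
    } BufferObject;

    typedef struct {            /* stand-in for an X pixmap */
        int   width, height, bpp;
        void *driver_private;   /* EXA-style per-pixmap driver data */
    } Pixmap;

    /* What an EXA CreatePixmap-style hook could do: allocate a buffer
     * object and attach it to the pixmap as driver-private data. */
    static Pixmap *create_pixmap(int w, int h, int bpp)
    {
        Pixmap *p = malloc(sizeof *p);
        BufferObject *bo = malloc(sizeof *bo);
        static unsigned next_handle = 1;

        bo->handle = next_handle++;
        bo->size = (size_t)w * h * (bpp / 8);

        p->width = w;
        p->height = h;
        p->bpp = bpp;
        p->driver_private = bo;
        return p;
    }

    int main(void)
    {
        Pixmap *p = create_pixmap(64, 64, 32);
        BufferObject *bo = p->driver_private;

        printf("pixmap %dx%d -> bo handle %u, %zu bytes\n",
               p->width, p->height, bo->handle, bo->size);

        free(bo);
        free(p);
        return 0;
    }
    ```

    The acceleration hooks themselves (solid fills, copies, composite) would then resolve a pixmap to its buffer object through that private pointer, which is the part UXA kept while discarding the rest of EXA's offscreen-memory management.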

  9. #9

    So that Intel has a lead, since their drivers now use UXA.

  10. #10

    As far as I understand, the only advantage of UXA is that IGPs don't have to make an extra copy. For discrete cards, it offers no real advantages that can't be done with EXA.

    Also, saying that 2D acceleration is on the way out is kind of silly. It may be true that in the future 2D will be treated simply as a subset of 3D, but that doesn't mean it will go away. There are still servers and other applications that don't need 3D but do need fast 2D (for UI).
