
Thread: Will AMD's XvBA Beat Out NVIDIA's VDPAU?

  1. #31

    Default

    Quote Originally Posted by Silent Storm View Post
    I didn't read all the messages, but I've seen an "NVIDIA head start" argument. Unfortunately, ATI had it first, literally years ago, in the Radeon 8500 era. It was called VideoShaders and VideoSoap. Has anyone heard of them? Unlikely, because nothing except ATI's tech demos used them. It was a CPU-independent video acceleration and post-processing pipeline, and it worked very well.
    I said that, and I should have been clearer: what I meant was that NVIDIA had a head start when it came to hardware decoding on UNIX-like operating systems.

  2. #32
    Join Date
    Jun 2007
    Posts
    145

    Default

    OT:
    Am I the only one who read the blog entry and noticed this: "The Poulsbo (US15W) video driver may be Open Source'd by Q4 2009"?

    Well, that's even better!

  3. #33
    Join Date
    Aug 2007
    Posts
    6,673

    Default

    @bridgman

    If XvBA works so well, why doesn't ATI ship the headers needed to use it? Some programming examples wouldn't hurt either...

  4. #34
    Join Date
    Jun 2006
    Posts
    24

    Default

    Quote Originally Posted by thefirstm View Post
    I said that, and I should have been clearer: what I meant was that NVIDIA had a head start when it came to hardware decoding on UNIX-like operating systems.
    ATI had GPU acceleration on Linux before, in the 9600 and X1600 era, but dropped it to rewrite it from scratch. It was not literally hardware decoding, but it accelerated most of the video pipeline and enabled 720p playback over Xv with reasonable CPUs like high-end Athlon XPs and first-generation Athlon 64s.

    What you are right about is the adoption and marketing side of it, and that by no means diminishes your argument. Even though the technical side is different, the real situation again reflects your point.

    I wanted to point out that being better and being first doesn't always bring victory, but instead of writing that directly, I worked the whole idea into the post.

  5. #35
    Join Date
    Jun 2009
    Location
    Paris
    Posts
    432

    Default

    Quote Originally Posted by deanjo View Post
    Take the one that is most mature and complete and run with it.
    That's exactly why I chose VA-API and now stick with it.

    Quote Originally Posted by deanjo View Post
    One thing the tests also did not reveal is whether the CPUs stayed in their lowest power state, or if any frames were dropped. Without a timeline graph, or at least a frame count and a played-frame count, the results are very incomplete.
    The initial testing used the -frames 1000 option, but I changed the procedure when I noticed that only the Poulsbo dropped a few frames with the Riddick video (and only with this video). Dynamic frequency scaling was also enabled, so the CPUs were at their lowest frequency: 800 MHz for the Atom CPU, 1 GHz for the Phenom CPU. Then I redid the test in "performance" mode to get interesting measurements with respect to Xv.
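
    If anyone wants to reproduce the power-state check, here is a minimal sketch (assuming the standard Linux cpufreq sysfs interface; the path and sample count are arbitrary, not part of my test setup) that polls cpu0's frequency while a clip plays:

    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* Sample cpu0's current frequency once per second for ten
           seconds while the video is playing in another terminal. */
        const char *path =
            "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq";

        for (int i = 0; i < 10; i++) {
            FILE *f = fopen(path, "r");
            if (!f) {
                perror(path);
                return 1;
            }
            unsigned long khz;
            if (fscanf(f, "%lu", &khz) == 1)
                printf("cpu0: %lu MHz\n", khz / 1000);
            fclose(f);
            sleep(1);
        }
        return 0;
    }

    If the printed value never leaves the minimum (800 MHz on the Atom, 1 GHz on the Phenom), the governor stayed in its lowest state for the whole run.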

    Quote Originally Posted by Kano View Post
    The Benchmarks are really funny. Did somebody notice that he used a "Mobility Radeon HD 4870 - 550 MHz, 1 GB" together with a "Phenom 8450"?
    Do you know about MXM-to-PCIe adapters? Yes, I had also tested a GTX 280M in the same box, but the results were not conclusive enough since the GPU was not running at its highest frequency. So I didn't publish the figures, though you can get them in the tarball. They won't represent the real G92 core capabilities, though.

  6. #36
    Join Date
    Sep 2007
    Posts
    122

    Default

    Quote Originally Posted by gbeauche View Post
    That's exactly why I chose VA-API and now stick with it.
    I'd rather consider VDPAU to be the most mature API. It is widely supported by applications, works well (on NVIDIA GPUs) and is very well documented. Currently it's the only video decoding API with an implementation that "just works" *now*, without major fuss.

    VA-API, on the other hand, is badly documented, if at all, and seems to be missing some of the functionality provided by VDPAU.
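
    As a rough illustration of how little fuss the bring-up involves, here is a minimal sketch (assuming libvdpau on X11; everything beyond device creation is fetched through get_proc_address, and error handling is trimmed):

    #include <stdio.h>
    #include <stdint.h>
    #include <X11/Xlib.h>
    #include <vdpau/vdpau.h>
    #include <vdpau/vdpau_x11.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;

        VdpDevice device;
        VdpGetProcAddress *get_proc_address;

        /* Bind a VDPAU device to the default screen; the vendor
           driver is loaded behind this call. */
        if (vdp_device_create_x11(dpy, DefaultScreen(dpy),
                                  &device, &get_proc_address) != VDP_STATUS_OK) {
            fprintf(stderr, "no VDPAU driver for this display\n");
            return 1;
        }

        /* Every other function (decoder, mixer, presentation queue)
           is looked up the same way, by its VDP_FUNC_ID_* constant. */
        void *fn;
        if (get_proc_address(device, VDP_FUNC_ID_GET_API_VERSION, &fn)
                == VDP_STATUS_OK) {
            uint32_t version;
            ((VdpGetApiVersion *)fn)(&version);
            printf("VDPAU API version %u\n", version);
        }

        XCloseDisplay(dpy);
        return 0;
    }

    Build with something like cc vdpau_info.c -lvdpau -lX11. The decoder itself comes from the VDP_FUNC_ID_DECODER_CREATE entry point, obtained the same way.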
    Last edited by greg; 07-07-2009 at 08:48 AM.

  7. #37
    Join Date
    Jun 2009
    Location
    Paris
    Posts
    432

    Default

    Quote Originally Posted by greg View Post
    I'd rather consider VDPAU to be the most mature API. It is widely supported by applications, works well (on NVIDIA GPUs, no idea about S3) and is very well documented. Currently it's the only video decoding API with an implementation that "just works" *now*, without major fuss.

    VA-API, on the other hand, is badly documented, if at all, and seems to be missing some of the functionality provided by VDPAU.
    S3 doesn't use VDPAU.

    The question was about mature and complete. VA-API is just that:
    - complete: it supports more codecs, plus video encode acceleration
    - mature: well, it has been around for a long time, though implementations were not public. There are at least 4 (if not 5) "native" implementations, i.e. real drivers, not counting my bridges.

    Now, as I said, application support is weaker due to the initial lack of drivers, but VA-API is as trivial to add to a player as VDPAU; see the sketch below. So this can change quite easily.
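
    For comparison with the VDPAU snippet above, the application-side glue is about the same size. A minimal sketch (assuming libva with the X11 winsys; error handling trimmed):

    #include <stdio.h>
    #include <stdlib.h>
    #include <X11/Xlib.h>
    #include <va/va.h>
    #include <va/va_x11.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;

        /* Bind VA-API to the X display; vaInitialize() loads the
           vendor driver behind the scenes. */
        VADisplay va_dpy = vaGetDisplay(dpy);
        int major, minor;
        if (vaInitialize(va_dpy, &major, &minor) != VA_STATUS_SUCCESS) {
            fprintf(stderr, "no VA-API driver for this display\n");
            return 1;
        }
        printf("VA-API %d.%d, vendor: %s\n", major, minor,
               vaQueryVendorString(va_dpy));

        /* List the decode profiles (MPEG-2, H.264, VC-1, ...)
           that the driver exposes. */
        int num = vaMaxNumProfiles(va_dpy);
        VAProfile *profiles = malloc(num * sizeof(*profiles));
        if (profiles &&
            vaQueryConfigProfiles(va_dpy, profiles, &num) == VA_STATUS_SUCCESS) {
            for (int i = 0; i < num; i++)
                printf("supported profile id: %d\n", profiles[i]);
        }
        free(profiles);

        vaTerminate(va_dpy);
        XCloseDisplay(dpy);
        return 0;
    }

    A real player then goes on to vaCreateConfig() and vaCreateContext(), which is roughly the same amount of glue as VDPAU's get_proc_address dance.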

  8. #38
    Join Date
    Feb 2008
    Posts
    214

    Default

    GEM vs TTM

    XvBA vs X-Video vs VDPAU vs VA-API

    ...

    When will this API/subsystem nightmare end? Please make a unified API for hardware video decoding, this is a pain in the ass...

  9. #39
    Join Date
    May 2007
    Location
    Third Rock from the Sun
    Posts
    6,587

    Default

    Quote Originally Posted by gbeauche View Post
    S3 doesn't use VDPAU.
    Wrong.

    RELEASE HISTORY

    06/26/2009: Version 14.02.17
    - Bug Fixes
    - XRandR support
    - VDPAU support
    - KMS Support

    The S3 Graphics Accelerated Linux Driver Set support:

    * Linux Kernel 2.6.x
    * X.Org X11R7.x with H/W 2D acceleration through XAA or EXA
    * SAMM / MAMM / Xinerama with multiple display
    * DVI dual-link up to 2560x1600 resolution
    * 90/180/270 degree display rotation
    * H/W accelerated direct-rendering OpenGL 3.0 API
    * H/W accelerated indirect-rendering OpenGL 2.1 API
    * Composite Desktop with AIGLX / Compiz
    * Full featured RandR 1.2 function
    * Kernel mode setting with standalone module
    * Full H.264, VC-1, WMV9 and MPEG-2 VLD bitstream H/W decoding through VDPAU or VA-API driver

    This README describes how to install, configure, and use the S3 Graphics
    Accelerated Linux Driver Set.

    http://drivers.s3graphics.com/en/dow...N_Linux_EN.txt
    Last edited by deanjo; 07-07-2009 at 09:34 AM. Reason: Highlighted for the blind

  10. #40
    Join Date
    Apr 2009
    Posts
    565

    Default

    Quote Originally Posted by MostAwesomeDude View Post

    And no, I don't think anybody on the open-source side wants to do any more split video decoding backends. Let's just do everything on Gallium and be happy with it.

    ~ C.
    Apparently there was a SOC2009 idea for VDPAU via Gallium:
    http://xorg.freedesktop.org/wiki/SummerOfCodeIdeas

    I don't know if it materialized, but this is _really_ the way to go!
