The D3D drivers are far more stable. That stability has a bit less to do with D3D being a better API and much more to do with D3D actually being used. Recall that Linux users still frequently run non-GL-accelerated desktops, relying on DDX drivers and EXA/RENDER to get their basic apps on screen. OS X uses an entirely Apple-written driver architecture. Very few Windows apps use OpenGL, and even most of the ones people think use OpenGL actually use D3D: all Windows implementations of WebGL run over ANGLE, many of the "big 3D content apps" have switched over to D3D on Windows, and the games that do use OpenGL are almost entirely simple little 2D indie titles that don't do anything even remotely interesting with the GPU.
Granted, Microsoft also has conformance tests for D3D and designed the driver model properly, so that each vendor didn't have to reimplement all of D3D internally. Khronos, meanwhile, still doesn't offer a test suite, much less a core OpenGL framework for vendors to build on (and even if it did, at this point the vendors have too much invested in their internal implementations, so a switch is unlikely to happen without some strong-arming).
The overlay path is guaranteed never to tear, whereas the blit method is classed as "best effort".
I'll fix the quote of my message for you:

"You are fail: Now confirmed not by nVidia employees, but by nVidia proprietary driver documentation. Thank you for this link."

The overlay path is guaranteed never to tear, whereas the blit method is classed as "best effort".
The following conditions or system configurations will prevent usage of the overlay path:
- Overlay hardware already in use, e.g. by another VDPAU, GL, or X11 application, or by SDI output.
- Desktop rotation enabled on the given X screen.
- The presentation target window is redirected, due to a compositing manager actively running.
- The environment variable VDPAU_NVIDIA_NO_OVERLAY is set to a string representation of a non-zero integer.
- The driver determines that the performance requirements of overlay usage cannot be met by the current hardware configuration.
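Two of those conditions are things an application can check for itself before assuming it will get the tear-free overlay path. Here's a minimal C sketch (my own illustration, not code from the driver documentation) that detects a running compositing manager via the EWMH _NET_WM_CM_Sn selection and inspects the VDPAU_NVIDIA_NO_OVERLAY override:

```c
/* Minimal sketch, not from the driver docs: check two of the
 * conditions listed above that an application can observe itself.
 * Build with: cc overlay_check.c -lX11 */
#include <stdio.h>
#include <stdlib.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    /* Per EWMH, a compositing manager owns the _NET_WM_CM_Sn
     * selection for the screen it composites. */
    char sel_name[32];
    snprintf(sel_name, sizeof sel_name, "_NET_WM_CM_S%d",
             DefaultScreen(dpy));
    Atom sel = XInternAtom(dpy, sel_name, False);
    int compositing = XGetSelectionOwner(dpy, sel) != None;

    /* The driver README says a non-zero integer here disables
     * the overlay path. */
    const char *no_overlay = getenv("VDPAU_NVIDIA_NO_OVERLAY");
    int overlay_disabled = no_overlay && atoi(no_overlay) != 0;

    printf("compositing manager running: %s\n",
           compositing ? "yes (overlay path unavailable)" : "no");
    printf("VDPAU_NVIDIA_NO_OVERLAY set: %s\n",
           overlay_disabled ? "yes (overlay path disabled)" : "no");

    XCloseDisplay(dpy);
    return 0;
}
```

If either check trips, you're on the "best effort" blit path as far as the list above is concerned.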
Err, dude... You said tear-free cannot happen. But the documentation says "best effort". That doesn't mean it can't happen, it just means there's no guarantee. No guarantee != impossible. No guarantee = you might get tear-free output, but you might not.
So the fail is 100% you.
Besides, the discussion was about the Nvidia devs supposedly saying tear-free video in a composited environment is impossible. Which is false.
Even if you don't notice tearing yourself (without special samples that expose the issue), that doesn't mean there is no tearing.
You know, it's like a high-frequency squeal (I hope that's the right English term) in an audio stream, or an incorrect color space conversion in video: most people don't notice such problems, but that doesn't mean the problems don't exist.
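Such a "special sample" is easy to build, by the way. A minimal sketch, assuming plain Xlib (illustration only, not a sample from this thread): a white bar sweeping across a black window. Without working vsync, the bar visibly shears wherever scanout catches a half-drawn frame:

```c
/* Classic tearing test: a vertical bar sweeping across the window.
 * My own illustration, not code from this thread.
 * Build with: cc tear_test.c -lX11 */
#include <unistd.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    int scr = DefaultScreen(dpy);
    int w = 640, h = 480;
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 0, 0,
                                     w, h, 0, BlackPixel(dpy, scr),
                                     BlackPixel(dpy, scr));
    XMapWindow(dpy, win);
    GC gc = XCreateGC(dpy, win, 0, NULL);

    for (int x = 0;; x = (x + 8) % w) {
        /* Erase, then draw the bar at its new position. Any tearing
         * shows up as a horizontal break in the bar. */
        XSetForeground(dpy, gc, BlackPixel(dpy, scr));
        XFillRectangle(dpy, win, gc, 0, 0, w, h);
        XSetForeground(dpy, gc, WhitePixel(dpy, scr));
        XFillRectangle(dpy, win, gc, x, 0, 32, h);
        XFlush(dpy);
        usleep(16000); /* roughly 60 updates per second */
    }
}
```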