
Thread: ATI R600/700 OSS 3D Driver Reaches Gears Milestone

  1. #281
    Join Date
    Aug 2008
    Location
    Finland
    Posts
    1,577

    Default

    Quote Originally Posted by lucky_ View Post
    a software-rendered OpenGL backend would be even faster than X11/XRender.
    You mean a software-rendered OpenGL backend would be faster than hardware-rendered XRender? Sounds like an interesting claim.
    Edit: The important bit here seems to be analyzing which of them spends more time in the software-fallback state, and why. Does Qt perhaps use excess features that could be dropped to reduce the time spent in software fallback, with negligible impact on the user-friendliness of the UI? This is what I meant by bloat.
    Last edited by nanonyme; 08-17-2009 at 08:01 AM.

  2. #282
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,126

    Default

    Quote Originally Posted by nanonyme View Post
    You mean a software-rendered OpenGL backend would be faster than hardware-rendered XRender? Sounds like an interesting claim.
    Edit: The important bit here seems to be analyzing which of them spends more time in the software-fallback state, and why. Does Qt perhaps use excess features that could be dropped to reduce the time spent in software fallback, with negligible impact on the user-friendliness of the UI? This is what I meant by bloat.
    Note that a fallback on XRender involves a VRAM->RAM download (extremely slow), software rendering, and a final RAM->VRAM upload. Add to that the overhead of X11, and it's not inconceivable that software OpenGL can be faster than software XRender.

    Obviously, hardware XRender will be faster than software OpenGL, but the point is optimizing the worst case. Besides, hardware OpenGL is likely to be faster than XRender on modern hardware, which lacks a 2d engine.

    I wonder if OpenGL could be used to add cross-platform hardware acceleration to Qt. Right now, the developers have to test and optimize at least three different code paths (GDI, XRender, Quartz), not to mention whatever stuff mobile devices ship with. Using OpenGL and OpenGL ES, on the other hand, would allow acceleration on most modern devices - sounds like a worthy goal.
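
    For what it's worth, Qt 4.5 already lets you pick the paint backend at runtime, so the three candidates (native XRender, the pure-software raster engine, OpenGL) can be compared directly. A minimal sketch, assuming Qt 4.5 or later; the label is just a placeholder:

    Code:
    // Select Qt's paint backend before the QApplication is constructed.
    // "native" = platform engine (XRender on X11), "raster" = the pure
    // software engine, "opengl" = the experimental OpenGL backend.
    #include <QApplication>
    #include <QLabel>

    int main(int argc, char *argv[])
    {
        // Equivalent to launching the program with: -graphicssystem opengl
        QApplication::setGraphicsSystem("opengl");

        QApplication app(argc, argv);
        QLabel label("Rendered through the OpenGL graphics system");
        label.show();
        return app.exec();
    }

    The same switch is available on the command line as -graphicssystem raster|opengl, which makes A/B timing of the backends fairly painless.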

  3. #283
    Join Date
    Dec 2007
    Posts
    91

    Default

    Quote Originally Posted by nanonyme View Post
    You mean a software-rendered OpenGL backend would be faster than hardware-rendered XRender? Sounds like an interesting claim.
    Actually, since not every driver provides the same set of features, Qt, when "forced to use" some of them, will fall back to its raster engine because it won't trust the underlying platform. This is where the penalty gets big, because of the VRAM-to-RAM transfer, which is a huge pain.
    And according to Zack's and other Qt devs' blogs, there is a gap between the theoretically efficient XRender and what is actually available.
    XRender ends up being slower than their pure software engine, so assuming that OpenGL, even running only in software, can be quicker than XRender is not too far-fetched.
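
    That penalty is easy to provoke yourself: paint the same thing into a QImage (always client-side memory, handled by the raster engine) and into a QPixmap (server-side on X11, where a fallback forces the download/upload). A hedged micro-benchmark sketch; the radial gradient is only an assumed fallback trigger, since whether it really falls back depends on the driver:

    Code:
    // Compare the raster engine (QImage, client-side RAM) with the
    // native engine (QPixmap, server-side on X11, potentially VRAM).
    #include <QApplication>
    #include <QImage>
    #include <QPixmap>
    #include <QPainter>
    #include <QRadialGradient>
    #include <QTime>
    #include <QDebug>

    template <typename Surface>
    int paintGradients(Surface &surface, int iterations)
    {
        QRadialGradient gradient(256, 256, 256);
        gradient.setColorAt(0.0, Qt::white);
        gradient.setColorAt(1.0, Qt::blue);

        QTime timer;
        timer.start();
        for (int i = 0; i < iterations; ++i) {
            QPainter p(&surface);
            p.fillRect(surface.rect(), gradient);
        }
        return timer.elapsed();  // milliseconds
    }

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);  // QPixmap needs a display connection

        QImage image(512, 512, QImage::Format_ARGB32_Premultiplied);
        QPixmap pixmap(512, 512);

        qDebug() << "QImage (raster): " << paintGradients(image, 100) << "ms";
        qDebug() << "QPixmap (native):" << paintGradients(pixmap, 100) << "ms";
        return 0;
    }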

  4. #284
    Join Date
    Jan 2009
    Posts
    117

    Default

    Pure software 3D engine or software OpenGL rendering: which one is faster? I'll give you one guess.

    The same applies to any framework that provides abstracted hardware acceleration: there is always overhead involved in handling the API. So if you write it all in pure software, tightly coupled, you can beat any of them. The downside is much larger code, which is a far worse problem than slightly lower performance.
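
    To make that trade-off concrete, here's an illustrative sketch (hypothetical code, nothing from Qt): the same per-pixel operation dispatched through an abstract interface versus written as a tight inline loop. How big the gap is depends on how well the compiler can devirtualize and vectorize:

    Code:
    #include <cstdint>
    #include <cstdio>
    #include <ctime>
    #include <vector>

    // Abstracted path: every pixel goes through a virtual call.
    struct PixelOp {
        virtual ~PixelOp() {}
        virtual uint32_t apply(uint32_t src) const = 0;
    };

    struct Darken : PixelOp {
        uint32_t apply(uint32_t src) const { return (src >> 1) & 0x7f7f7f7f; }
    };

    void fillAbstract(std::vector<uint32_t> &buf, const PixelOp &op)
    {
        for (std::size_t i = 0; i < buf.size(); ++i)
            buf[i] = op.apply(buf[i]);  // indirect call per pixel
    }

    // Tightly coupled path: identical work, inlined by the compiler.
    void fillDirect(std::vector<uint32_t> &buf)
    {
        for (std::size_t i = 0; i < buf.size(); ++i)
            buf[i] = (buf[i] >> 1) & 0x7f7f7f7f;
    }

    int main()
    {
        std::vector<uint32_t> buf(1 << 22, 0xffffffffu);
        Darken op;

        std::clock_t t0 = std::clock();
        fillAbstract(buf, op);
        std::clock_t t1 = std::clock();
        fillDirect(buf);
        std::clock_t t2 = std::clock();

        std::printf("abstract: %ld ticks, direct: %ld ticks\n",
                    (long)(t1 - t0), (long)(t2 - t1));
        return 0;
    }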

  5. #285
    Join Date
    Aug 2008
    Location
    Finland
    Posts
    1,577

    Default

    Quote Originally Posted by lucky_ View Post
    This is where the penalty gets big, because of the VRAM-to-RAM transfer, which is a huge pain.
    And according to Zack's and other Qt devs' blogs, there is a gap between the theoretically efficient XRender and what is actually available.
    I was under the impression that fglrx was the only driver lacking an efficient XRender implementation. *shrug*

  6. #286
    Join Date
    Dec 2007
    Posts
    91

    Default

    Quote Originally Posted by nanonyme View Post
    I was under the impression that fglrx was the only driver lacking an efficient XRender implementation. *shrug*
    Well, maybe you're right, but according to this post,
    http://zrusin.blogspot.com/2009/08/2d-in-kde.html

    it seems that going the X11 way is the worst path you can follow. Hence the discussion I put forward.
    Maybe he only considered the open-source stack, where there certainly are many differences between the drivers' feature sets.

  7. #287
    Join Date
    Jan 2009
    Location
    UK
    Posts
    331

    Default

    Quote Originally Posted by nanonyme View Post
    I was under the impression that fglrx was the only driver lacking an efficient XRender implementation. *shrug*
    Have people already forgotten the nVidia disaster with KDE 4.0? Users with $600 of graphics hardware were getting a joke framerate because the driver sucked so bad at simple 2D.

  8. #288
    Join Date
    Jun 2009
    Posts
    2,926

    Default

    Quote Originally Posted by Ant P. View Post
    Have people already forgotten the nVidia disaster with KDE 4.0? Users with $600 of graphics hardware were getting a joke framerate because the driver sucked so bad at simple 2D.
    I didn't forget it -- especially since support for my chipset was discontinued, so my laptop never benefited from all the patches nvidia made to fix the problems.

  9. #289
    Join Date
    Jul 2008
    Location
    Greece
    Posts
    3,777

    Default

    NVidia: 1 or 2 fsck-ups.
    AMD: Dozens and counting.

  10. #290
    Join Date
    Jul 2008
    Posts
    1,718

    Default

    nvidia fucked up so hard I switched over to AMD.
