
Thread: Benchmarks Of AMD's Newest Gallium3D Driver

  1. #81
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146

    Default

    Quote Originally Posted by Qaridarium View Post
    Are you sure they use any fixed functions of DX?
    Fixed-function DX died with DX7. This solution uses DX9, which means HLSL.

    There are hundreds of HLSL-/GLSL-based raytracing implementations. You don't need OpenCL to make this happen.
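    As a rough illustration, here is the per-pixel core of such a raytracer, sketched in Python for readability; a GLSL or HLSL fragment shader evaluates the same math per fragment on the GPU. The one-sphere scene and camera setup are invented for the example:

    [code]
    import math

    WIDTH, HEIGHT = 64, 32
    CENTER = (0.0, 0.0, -3.0)  # sphere center, in front of the camera
    RADIUS = 1.0

    def shade(dx, dy, dz):
        """Trace one unit-length ray from the origin; return 0..1 brightness."""
        # Ray/sphere intersection: solve t^2 + b*t + c = 0 (a = 1 for unit d).
        lx, ly, lz = -CENTER[0], -CENTER[1], -CENTER[2]  # origin - center
        b = 2.0 * (dx * lx + dy * ly + dz * lz)
        c = lx * lx + ly * ly + lz * lz - RADIUS * RADIUS
        disc = b * b - 4.0 * c
        if disc < 0.0:
            return 0.0                    # the ray misses the sphere
        t = (-b - math.sqrt(disc)) / 2.0  # nearest hit along the ray
        if t < 0.0:
            return 0.0
        # Normal at the hit point, then simple "headlight" diffuse shading.
        nx = (t * dx - CENTER[0]) / RADIUS
        ny = (t * dy - CENTER[1]) / RADIUS
        nz = (t * dz - CENTER[2]) / RADIUS
        return max(0.0, -(nx * dx + ny * dy + nz * dz))

    for y in range(HEIGHT):
        row = ""
        for x in range(WIDTH):
            # Pinhole-camera ray through this pixel (terminal characters are
            # roughly twice as tall as wide, which keeps the sphere round).
            dx = 2.0 * (x + 0.5) / WIDTH - 1.0
            dy = 1.0 - 2.0 * (y + 0.5) / HEIGHT
            dz = -1.0
            n = math.sqrt(dx * dx + dy * dy + dz * dz)
            row += " .:-=+*#%@"[int(shade(dx / n, dy / n, dz / n) * 9.99)]
        print(row)
    [/code]

    The whole "engine" is one intersection test plus one shading formula per pixel, which is exactly the kind of arithmetic shader languages are built for.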

  2. #82
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    Quote Originally Posted by BlackStar View Post
    Fixed-function DX died with DX7. This solution uses DX9, which means HLSL.

    There are hundreds of HLSL-/GLSL-based raytracing implementations. You don't need OpenCL to make this happen.
    OpenCL only needs to be better than HLSL/GLSL.

  3. #83
    Join Date
    Nov 2007
    Posts
    1,024

    Default

    Quote Originally Posted by Qaridarium View Post
    And your talking about FPS with raytracing is just complete nonsense!
    Think about this for a minute.

    If you do not render whole frames, you end up with these "ant lines." So say you only render 1/3 of a frame. Let's say that instead of tearing or incorrect pixels, we just end up with 1/3 of a valid scene evenly distributed across the screen, with the remaining pixels being old scene data. Now use this technology outside of the proof-of-concept demos and in real games like, say, Left 4 Dead.

    (Q) What happens when you move around at high speed, looking left and right and jittering around firing guns, and almost every single pixel changes on every single game update at around 60 Hz?
    (A) You end up with a completely unrecognizable mess of smeared color across your screen that results in a completely and utterly unplayable game.

    At some point in the future, when ray tracing is more than just the toy demos you've found on YouTube, the scenes will be rendered to an entire frame and displayed at once. Because they have to be. Because the alternative is not usable or playable technology, not remotely.

    Also, try googling "ray tracing fps." The first five hits for me were papers written by the actual graphics hardware vendors about GPGPU ray tracers... and they absolutely, beyond any doubt, measure things in FPS. Because real, non-toy raytracers do not accept "ant lines" as an acceptable outcome of a render, period.

  4. #84
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    Quote Originally Posted by elanthis View Post
    Think about this for a minute.

    If you do not render whole frames, you end up with these "ant lines." So say you only render 1/3 of a frame. Let's say that instead of tearing or incorrect pixels, we just end up with 1/3 of a valid scene evenly distributed across the screen, with the remaining pixels being old scene data. Now use this technology outside of the proof-of-concept demos and in real games like, say, Left 4 Dead.

    (Q) What happens when you move around at high speed, looking left and right and jittering around firing guns, and almost every single pixel changes on every single game update at around 60 Hz?
    (A) You end up with a completely unrecognizable mess of smeared color across your screen that results in a completely and utterly unplayable game.

    At some point in the future, when ray tracing is more than just the toy demos you've found on YouTube, the scenes will be rendered to an entire frame and displayed at once. Because they have to be. Because the alternative is not usable or playable technology, not remotely.

    Also, try googling "ray tracing fps." The first five hits for me were papers written by the actual graphics hardware vendors about GPGPU ray tracers... and they absolutely, beyond any doubt, measure things in FPS. Because real, non-toy raytracers do not accept "ant lines" as an acceptable outcome of a render, period.
    Tearing is not the same as ant noise.

    There is no modern, sophisticated realtime raytracing engine without ant noise.

    And no realtime raytracing engine renders full frames; they only deliver rays per second (RPS) at the monitor's native refresh rate.

    But ant noise does not mean noise that is visible to humans.

    You do not need to render 100% of a frame, because a human cannot see the difference between 90% and 100%, or between 80% and 100%.

    In most apps 50% is fine, because by the second frame it is 75%.
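    A quick sketch of that 50% -> 75% arithmetic, assuming each displayed frame re-traces an independently chosen half of the pixels (the refresh strategy is an assumption for illustration, not how any particular engine schedules its rays):

    [code]
    import random

    FRACTION = 0.5            # share of pixels re-traced per displayed frame
    PIXELS = 100_000
    stale = [True] * PIXELS   # True = pixel still shows the old scene

    for frame in range(1, 6):
        for i in range(PIXELS):
            if stale[i] and random.random() < FRACTION:
                stale[i] = False  # this pixel got re-traced this frame
        fresh = 1.0 - sum(stale) / PIXELS
        expected = 1.0 - (1.0 - FRACTION) ** frame  # closed form: 1 - 0.5^n
        print(f"frame {frame}: {fresh:.3f} fresh (expected {expected:.3f})")
    [/code]

    The expected values come out to 50%, 75%, 87.5%, and so on: each extra frame halves the share of pixels still showing stale data.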

    At 60 FPS this means: if a human perceives 30 FPS as a movie, that human cannot tell the difference between 30 and 60 FPS in raytracing, because the screen changes per pixel rather than waiting on a delivery deadline per frame.

    You get impressive graphic effects just because ant noise imitates a natural uncertainty of movement.

    ""ant lines." "

    there are no ant lines on raytracing its per pixel means you really can not watch the ants on an higher RPS rate

    "At some point in the future, when ray tracing is more than just the toy demos you've found on Youtube, the scenes will be rendered to an entire frame and displayed at once. Because they have to be. Because the alternative is not usable or playable technology, not remotely."

    That is simply wrong; every realtime raytracing engine works in a relative way.

    Realtime raytracing with OpenCL:

    http://www.youtube.com/watch?v=v1JS4wyGGy0

    http://www.youtube.com/watch?v=zxEsyukiRw4

  5. #85
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    http://www.youtube.com/watch?v=JT6Iyl35Wnc

    This video shows the noise ants very well.

  6. #86
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    A very good example: OpenCL + Bullet Physics doing raytracing:

    http://www.youtube.com/watch?v=33rU1axSKhQ

  7. #87
    Join Date
    Aug 2010
    Location
    Denmark
    Posts
    142

    Default

    A thought has occurred to me a couple of times in the past weeks:

    After seeing what is possible wrt. automatic benchmarking - like this graph from Phoromatic - I've been wondering whether this is possible with graphics drivers too.
    Something completely in line with the charts from the above link, but with a machine constantly pulling the newest git versions of r600c and r600g, compiling them, and running benchmarks.

    So on the X-axis we would have the date, exactly as on the Phoromatic page, and the Y-axis would have the FPS count for a specific game, like Nexuiz, for r600c, r600g, and fglrx.
    We could then see, very precisely, the performance gains that these two open drivers make - day by day.

    Is it just me or would that be extremely cool?

    To take it even further, each git commit in the driver code could be tied to a benchmark, to allow the developers to see any performance gains or hits that a patch introduces (a la this), and perhaps help hint at where the driver needs work to gain more performance.

    Is there any reason why this isn't possible, and why a custom, "hand-made" benchmark, like the one that is the subject of this thread, has to be performed?


    Quote Originally Posted by Qaridarium View Post
    A very good example: OpenCL + Bullet Physics doing raytracing:

    http://www.youtube.com/watch?v=33rU1axSKhQ
    Cool video! Looks so real, despite the simple textures etc.

  8. #88
    Join Date
    Aug 2008
    Location
    Finland
    Posts
    1,670

    Default

    Quote Originally Posted by runeks View Post
    Something completely in line with the charts from the above link, but with a machine constantly pulling the newest git versions of r600c and r600g, compiling them, and running benchmarks.
    Why go for the kill when you can go for overkill: we could have commit-by-commit benchmarking of r600c and r600g for commits that actually touch those drivers. This would also reveal speed-related regressions pretty much immediately after they land in the tree.
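    A rough sketch of how such a loop could look, in Python. Every path and command here is a placeholder (the repo location, the build command, and the benchmark script are assumptions, not real Phoronix Test Suite or Phoromatic interfaces):

    [code]
    import subprocess

    REPO = "/path/to/mesa"  # placeholder checkout location

    def git(*args):
        """Run a git command inside the repo and return its stdout."""
        return subprocess.run(["git", "-C", REPO, *args], check=True,
                              capture_output=True, text=True).stdout

    # Oldest-first list of commits touching the driver directory
    # (the path is a guess; point it at wherever r600g actually lives).
    commits = git("rev-list", "--reverse", "HEAD", "--",
                  "src/gallium/drivers/r600").splitlines()

    for commit in commits:
        git("checkout", commit)
        # Placeholder build and benchmark steps: substitute the real build
        # system and a script that prints a single FPS number, e.g. a
        # Nexuiz timedemo wrapper.
        subprocess.run(["make", "-j4"], cwd=REPO, check=True)
        fps = subprocess.run(["./run-nexuiz-benchmark.sh"], check=True,
                             capture_output=True, text=True).stdout.strip()
        print(commit[:12], fps)
    [/code]

    Plotting the printed pairs over commit date would give exactly the day-by-day graph runeks describes.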

  9. #89
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146

    Default

    Quote Originally Posted by Qaridarium View Post
    OpenCL only needs to be better than HLSL/GLSL.
    Eh, no. OpenCL has a different target audience than HLSL/GLSL. It is not a feasible replacement and it is not meant as one either.

  10. #90
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    Quote Originally Posted by BlackStar View Post
    Eh, no. OpenCL has a different target audience than HLSL/GLSL. It is not a feasible replacement and it is not meant as one either.
    I think raytracing was never the target of HLSL/GLSL.

    OpenCL is much better suited for that.
