
Thread: Frame latency analysis on Doom 3

  1. #31
    Join Date
    Jun 2012
    Posts
    343

    Default

    Quote Originally Posted by Paradox Uncreated View Post
    Of course it could be possible to do vsynced Hz, cleverly arranging things so that each vsync a buffer is delivered and the next frame calculated. One for the kernel engineers. (Who have time.)
    That assumes the CPU/GPU can keep up, which in modern games they typically can't. Doom 3 IS a decade old, after all; there's really no reason frame latency should be much above 10ms or so... [it would be interesting to run a Windows comparison...]

    http://techreport.com/review/21516/i...e-benchmarking

    Frame latency is a better benchmarking tool than FPS, because FPS averages out the slow periods, and minimum FPS can catch outliers while still hiding the latency problem. "Microstutter" on multi-GPU configs, for instance, is QUITE noticeable even as FPS reaches into the hundreds.
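
    To put numbers on it (hypothetical frame times, just to illustrate how an average hides a hitch): 99 frames at 5 ms plus a single 100 ms spike still reports a healthy average FPS.

    Code:
    # Hypothetical frame times: 99 smooth frames plus one 100 ms hitch.
    frame_times_ms = [5.0] * 99 + [100.0]

    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s  # ~168 FPS -- looks great on paper
    worst_ms = max(frame_times_ms)           # 100 ms -- a plainly visible stutter

    print(f"average FPS: {avg_fps:.0f}, worst frame: {worst_ms:.0f} ms")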

  2. #32

    Default

    Quote Originally Posted by gamerk2 View Post
    That assumes the CPU/GPU can keep up, which in modern games they typically can't. Doom 3 IS a decade old, after all; there's really no reason frame latency should be much above 10ms or so... [it would be interesting to run a Windows comparison...]

    http://techreport.com/review/21516/i...e-benchmarking

    Frame latency is a better benchmarking tool than FPS, because FPS averages out the slow periods, and minimum FPS can catch outliers while still hiding the latency problem. "Microstutter" on multi-GPU configs, for instance, is QUITE noticeable even as FPS reaches into the hundreds.
    G(l)aymer2k chimes in and shows a complete lack of understanding, and the incoherence of Guano. "That assumes..." He didn't understand jitter in the other thread either. It's ridiculous, it is a joke. Wherever these people work, avoid them like the plague.

    And then to go on to "frame latency" and "slow time periods". Never use these people as translators, to put it like that, because obviously it turns to shit in there.

  3. #33
    Join Date
    Jun 2012
    Posts
    343

    Default

    Quote Originally Posted by Paradox Uncreated View Post
    G(l)aymer2k chimes in and shows a complete lack of understanding, and the incoherence of Guano. "That assumes..." He didn't understand jitter in the other thread either. It's ridiculous, it is a joke. Wherever these people work, avoid them like the plague.

    And then to go on to "frame latency" and "slow time periods". Never use these people as translators, to put it like that, because obviously it turns to shit in there.
    You continue to make the silly assumption that all forms of jitter are OS/kernel related. Games are more likely to suffer jitter due to H/W effects rather than S/W.

  4. #34

    Default

    Quote Originally Posted by gamerk2 View Post
    You continue to make the silly assumption that all forms of jitter are OS/kernel related. Games are more likely to suffer jitter due to H/W effects rather than S/W.
    You are obviously nuts. I guess I am just going to have to get used to all the nutters on the internutz.

  5. #35

    Default PS:

    I think I have sufficiently solved jitter now though. Doom 3 jitter is even lower with renice (-20) + my listed tweaks. So I feel there is little to improve. It is gliding silky smooth now. No frame loss, and timing jitter is so low that I think it should be near impossible to see. So for my part, I don't need any numbers, and they would need to be more verbose than the option in Doom 3 anyway. But try it. You will see a big difference, and very enjoyable smooth frames. The trick with renice can also be used with a web browser, to have less jitter on YouTube videos, etc. There I also recommend Chromium, because it has the lowest jitter to begin with.
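
    For anyone who wants to script the renice trick, a minimal sketch in Python (assumes Linux and enough privilege to set a negative nice value, i.e. root or CAP_SYS_NICE; the "doom3" command name is just a placeholder):

    Code:
    import os
    import subprocess

    # Launch the game with niceness -20 (highest CPU priority).
    # Raising priority (a negative nice value) requires root or CAP_SYS_NICE.
    subprocess.run(
        ["doom3"],                        # placeholder command name
        preexec_fn=lambda: os.nice(-20),  # runs in the child before exec
    )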

    Case solved!

    Peace Be With You.

  6. #36
    Join Date
    Aug 2008
    Posts
    99

    Default

    Quote Originally Posted by thofke View Post
    What would then be measured in the y-range?
    The Y axis would still be frame duration. You can plot the exact same data by using the timestamp of each frame, instead of the frame number, as the X coordinate for each data point.

    I believe that Doom 3 timedemos are frame-for-frame identical across machines, because delays always happen in the same frame. Moreover, timedemos always have the same frame length.
    Timedemos wouldn't need any change, but the graphs in the linked Tech Report article show vastly different frame counts. Looking carefully at page 2, you can see the same pattern of spikes in different places on all four Radeon GPUs:

    [image: per-frame rendering time graphs for the four Radeon cards]

    Using the timestamp instead of frame number for the X coordinate would make spikes caused by game content line up, while spikes caused by the process getting interrupted would not line up.
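
    A rough sketch of that re-plot (hypothetical frame durations; uses matplotlib; the only change from the usual graph is computing an elapsed-time X axis instead of using the frame index):

    Code:
    import matplotlib.pyplot as plt

    # Hypothetical per-frame durations in milliseconds, with one spike.
    durations_ms = [16.7, 16.6, 16.8, 33.4, 16.7, 16.6, 16.7]

    # X coordinate: elapsed time at the start of each frame.
    timestamps_s, elapsed = [], 0.0
    for d in durations_ms:
        timestamps_s.append(elapsed)
        elapsed += d / 1000.0

    plt.plot(timestamps_s, durations_ms)
    plt.xlabel("elapsed time (s)")
    plt.ylabel("frame duration (ms)")
    plt.show()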

  7. #37
    Join Date
    Aug 2008
    Posts
    99

    Default

    Quote Originally Posted by Paradox Uncreated View Post
    You are obviously nuts. I guess I am just going to have to get used to all the nutters on the internutz.
    It's obvious that deficiencies in the OS scheduler can cause just as much jitter as a hiccup in the GPU, but otherwise what gamerk2 is saying makes sense to me.

    Can you define what you mean by jitter? I would define jitter as any variation over time between when a frame is expected to be displayed and when it is actually displayed (so, for example, a constant 33ms delay would not be jitter, but a delay that fluctuates between 0ms and 33ms would be jitter).
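
    To make that definition concrete, a small sketch (hypothetical timestamps; assumes a 60 Hz vsync schedule for the "expected" times):

    Code:
    # Jitter per the definition above: variation over time of
    # (actual display time - expected display time), in ms.
    # A constant offset is latency, not jitter.
    expected = [i * 16.67 for i in range(6)]       # 60 Hz schedule

    constant = [t + 33.0 for t in expected]        # always 33 ms late
    wobbling = [t + (33.0 if i % 2 == 0 else 0.0)  # 0-33 ms, alternating
                for i, t in enumerate(expected)]

    def jitter(actual):
        delays = [a - e for a, e in zip(actual, expected)]
        return round(max(delays) - min(delays), 3)  # peak-to-peak variation

    print(jitter(constant))  # 0.0  -> pure latency, no jitter
    print(jitter(wobbling))  # 33.0 -> jitter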

  8. #38
    Join Date
    May 2012
    Posts
    435

    Default

    Quote Originally Posted by Paradox Uncreated View Post
    I think I have sufficiently solved jitter now though. Doom 3 jitter is even lower with renice (-20) + my listed tweaks. So I feel there is little to improve. It is gliding silky smooth now. No frame loss, and timing jitter is so low that I think it should be near impossible to see. So for my part, I don't need any numbers, and they would need to be more verbose than the option in Doom 3 anyway. But try it. You will see a big difference, and very enjoyable smooth frames. The trick with renice can also be used with a web browser, to have less jitter on YouTube videos, etc. There I also recommend Chromium, because it has the lowest jitter to begin with.

    Case solved!

    Peace Be With You.
    Good, now you can stop posting nonsense.

  9. #39
    Join Date
    Jun 2012
    Posts
    343

    Default

    Quote Originally Posted by unix_epoch View Post
    It's obvious that deficiencies in the OS scheduler can cause just as much jitter as a hiccup in the GPU, but otherwise what gamerk2 is saying makes sense to me.

    Can you define what you mean by jitter? I would define jitter as any variation over time between when a frame is expected to be displayed and when it is actually displayed (so, for example, a constant 33ms delay would not be jitter, but a delay that fluctuates between 0ms and 33ms would be jitter).
    That's more or less correct. Granted, a constant 33ms latency wouldn't exactly be smooth either (a frame would be created in one cycle, repeated in the next because the following frame isn't ready, then the next frame displayed on the third cycle), but because the rate is constant we say there's no jitter; there remains a latency problem.

    Basically, for a GPU:

    Latency: The time it takes to create a frame
    Jitter: The measure of the latency difference between two frames

    You can have very high latency with no jitter. You can also have a lot of jitter with very little latency (more noticeable on 120Hz native displays).
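
    A quick sketch of those two measures from a list of per-frame times (hypothetical numbers):

    Code:
    # Per the definitions above: latency = time to create each frame (ms),
    # jitter = the latency difference between consecutive frames.
    latencies_ms = [16.7, 16.6, 33.4, 16.7, 16.8]  # hypothetical

    jitter_ms = [round(abs(b - a), 1)
                 for a, b in zip(latencies_ms, latencies_ms[1:])]

    print(jitter_ms)  # [0.1, 16.8, 16.7, 0.1] -- the 33 ms frame shows up twice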

    And again, I stress that Doom 3 really shouldn't be showing any significant latency/jitter anyway, considering you could max the thing out with a now-aged 8800 GTX...
