Thread: OpenGL Frame Latency / Jitter Testing On Linux

  1. #1
    Join Date
    Jan 2007
    Posts
    14,836

    Default OpenGL Frame Latency / Jitter Testing On Linux

    Phoronix: OpenGL Frame Latency / Jitter Testing On Linux

    Beyond there finally being Team Fortress 2 benchmarks on Linux, Phoronix now also supports OpenGL frame latency benchmarks! It's another much sought-after feature and request for graphics hardware and driver testing...

    http://www.phoronix.com/vr.php?view=MTQxNDI

  2. #2
    Join Date
    Nov 2008
    Location
    Madison, WI, USA
    Posts
    874

    Default

    Nice to have this. This, combined with capturing an APITrace, could really make performance tuning graphics drivers easier.

  3. #3
    Join Date
    Oct 2008
    Posts
    3,137

    Default I have questions about how this works

    On the Windows side, we've been told it's impossible to capture this information reliably without a piece of dedicated hardware that NVidia recently released to some review sites. This, though, seems to be done completely in software.

    Is this going to run into the same issues that Windows games had? Is it the same kind of less accurate test that came out on Windows a while back, which can show some obvious problems but not the real details of what's going on? Or is OpenGL somehow different from D3D in this regard?

  4. #4

    Default

    Quote Originally Posted by smitty3268 View Post
    On the Windows side, we've been told it's impossible to capture this information reliably without a piece of dedicated hardware that NVidia recently released to some review sites. This, though, seems to be done completely in software.

    Is this going to run into the same issues that Windows games had? Is it the same kind of less accurate test that came out on Windows a while back, which can show some obvious problems but not the real details of what's going on? Or is OpenGL somehow different from D3D in this regard?
    It's what's reported by the engine. Some references:

    http://www.iddevnet.com/doom3/

    http://phoronix.com/forums/showthrea...ysis-on-Doom-3

  5. #5
    Join Date
    Aug 2011
    Posts
    71

    Thumbs up

    Awesome, finally a really good measure of how choppy a game feels. Really cool work there, Michael

  6. #6
    Join Date
    Oct 2008
    Posts
    3,137

    Default

    Quote Originally Posted by Michael View Post
    Hmm, the documentation doesn't really seem to say anything about how it calculates that number. I guess I'll need to dive into the source code if I really want to know. Most likely it does the same thing the old Windows tests did, and just adds a callback event before the frame is displayed.

    As such, I think we need to take these tests with a grain of salt, although they are still very useful to see.

  7. #7
    Join Date
    Oct 2012
    Posts
    148

    Default

    Quote Originally Posted by smitty3268 View Post
    Hmm, the documentation doesn't really seem to say anything about how it calculates that number. I guess I'll need to dive into the source code if I really want to know. Most likely it does the same thing the old Windows tests did, and just adds a callback event before the frame is displayed.

    As such, I think we need to take these tests with a grain of salt, although they are still very useful to see.
    Why? If it just calls gettimeofday() after every draw call and presents the difference from the last frame, then it is showing frame latency. Sure, double or triple buffering will smooth things out for the user, but if you have peaks at the level of 30 ms, there's no way you won't notice it while gaming. So it does show how smooth the game is, and that's what matters most.
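    The timestamp-per-frame scheme described here can be sketched in a few lines. This is a hypothetical, engine-agnostic sketch: it uses Python's time.perf_counter in place of gettimeofday(), and render_frame stands in for whatever drawing and buffer-swap work the engine does each frame.

```python
import time

def measure_frame_times(render_frame, n_frames):
    """Timestamp each frame and record the gap since the previous one, in ms."""
    frame_times = []
    last = time.perf_counter()
    for _ in range(n_frames):
        render_frame()                      # draw + buffer swap would go here
        now = time.perf_counter()
        frame_times.append((now - last) * 1000.0)
        last = now
    return frame_times

def jitter_stats(frame_times_ms):
    """Mean frame time plus the worst spike above it: a rough smoothness score."""
    mean = sum(frame_times_ms) / len(frame_times_ms)
    return mean, max(frame_times_ms) - mean

# Stand-in renderer: ~10 ms per frame, i.e. roughly 100 FPS.
times = measure_frame_times(lambda: time.sleep(0.010), 30)
mean_ms, worst_spike_ms = jitter_stats(times)
print(f"mean {mean_ms:.1f} ms/frame, worst spike +{worst_spike_ms:.1f} ms")
```

    A spike of +20 ms over a 10 ms mean is exactly the 30 ms-class peak mentioned above, and it shows up clearly in this log even if vsync and buffering partly smooth what the user actually sees.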

  8. #8
    Join Date
    Oct 2012
    Posts
    136

    Default

    Quote Originally Posted by smitty3268 View Post
    On the windows side, we've been told it's impossible to capture this information reliably without a piece of dedicated hardware that NVidia recently released to some review sites. This seems to be completely in software, though.

    Is this going to run into the same issues that windows games had? Is it the same less accurate tests that came out on windows a while back, that can show some obvious problems but not the real details of what's going on? Or is OpenGL somehow different than D3D in this regard?
    Although Windows-related, this video was interesting


  9. #9
    Join Date
    Nov 2010
    Posts
    90

    Default

    Quote Originally Posted by DDF420 View Post
    Although Windows-related, this video was interesting ...
    I'm subscribed to the PC Perspective podcast and follow Ryan Shrout's work on "Frame Rating". Describing the video as "Windows-related" is a bit disingenuous!! The problem of more accurately measuring the "full latency" to render each frame is just as pertinent to Linux/OpenGL desktops as it is to Windows/DirectX desktops. I would love to see some "frame-rating" done on games run on top of Wine, for example.

    The difficulty in physically doing "frame-rating" analysis is not the dumb drawing of colour-coded bars on the frame edges (used to sort the captured frames back into render order). The application Nvidia wrote has been open-sourced, and that code was subsequently picked up and added as a feature to one of the many Windows-only third-party GPU utilities (sorry, forgotten which one!!)...

    The real difficulty is acquiring the $1000's worth of HDMI capture card needed to get the video frames back for subsequent analysis!! No doubt with Windows-only drivers...

    The post-processing calculations required are again relatively simple: detect the coloured bars, write the results to a spreadsheet, etc. Then you can perform statistical analysis on the results...
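    That post-processing pipeline could be sketched roughly like this. Everything here is hypothetical: captured frames are assumed to arrive as nested [row][column] lists of RGB tuples with a solid overlay bar in the top-left corner that cycles through a known palette, one colour per rendered frame. A real setup would read frames from an HDMI capture device with a proper image library.

```python
import csv

# Assumed overlay palette: one colour per rendered frame, cycling in order.
PALETTE = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)]

def bar_colour(frame):
    """Read the (r, g, b) of the top-left pixel; the bar is assumed solid."""
    return frame[0][0]

def classify_frames(frames):
    """Map each captured frame to its palette index (i.e. rendered-frame id)."""
    return [PALETTE.index(bar_colour(f)) for f in frames]

def write_report(indices, path):
    """Dump capture-frame -> palette-index pairs to a CSV for later stats."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["capture_frame", "palette_index"])
        for i, idx in enumerate(indices):
            writer.writerow([i, idx])

# Synthetic capture: rendered frame 2 was displayed twice (a stutter).
frames = [[[PALETTE[i]]] for i in (0, 1, 2, 2, 3)]
write_report(classify_frames(frames), "frame_rating.csv")
```

    A repeated palette index (frame 2 here) marks a frame shown twice, i.e. a stutter; a skipped index would mark a dropped frame. The statistical analysis then runs over the CSV.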

    In general, gamers are going to find frame stuttering far more annoying than a smoother but lower FPS (say 30 vs. 60). Games played through Wine suffer from frame-stuttering problems (unless your system is, say, 1-2 orders of magnitude more powerful than the equivalent Windows system required to play the game smoothly). Current Linux benchmark results do not accurately reflect this reality (not just picking on Phoronix here )...

    IMHO I can't really trust or rely on content that doesn't use this methodology... I've always felt Fraps-type measurements were a bit bogus...

    Ho hum... Just my $0.02

  10. #10
    Join Date
    Jun 2010
    Location
    ฿ 16LDJ6Hrd1oN3nCoFL7BypHSEYL84ca1JR
    Posts
    1,052

    Default

    Quote Originally Posted by bobwya View Post
    The real difficulty is acquiring the $1000's worth of HDMI capture card needed to get the video frames back for subsequent analysis!! No doubt with Windows-only drivers...

    The post-processing calculations required are again relatively simple: detect the coloured bars, write the results to a spreadsheet, etc. Then you can perform statistical analysis on the results...
    Why can't you use a normal video camera that can record at least 60 fps and just film the screen? Sure, you need somewhat more involved computer-vision methods to analyse it, but it should be about as good. It may not sync exactly to screen refreshes, but the delay would always be the same, so it wouldn't matter, would it? (I don't really know exactly how screens update.)
