PS: If you try +playdemo you will see that the demo is recorded with jitter, so it's like watching a poor YouTube stream, and thus useless for a jitter benchmark. If a demo is to be used, it needs to be generated without jitter.
Most likely the jitter is in your brain. Why don't you record one and compare?
Why would you make such a non-constructive comment? If you do not have anything to add, please refrain from commenting.
Originally Posted by Wikipedia
Jitter is the undesired deviation from true periodicity of an assumed periodic signal in electronics and telecommunications, often in relation to a reference clock source. Jitter may be observed in characteristics such as the frequency of successive pulses, the signal amplitude, or phase of periodic signals.
If you look at a game's frame rate as a periodic signal, or as the successive pulses from the definition above, any deviation from that periodicity can be called jitter.
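To make that concrete, here is a minimal sketch (all timestamps are invented, not measurements from this thread) that treats frame timestamps as the "pulses" and reports their deviation from periodicity:

```python
# Minimal sketch: measure jitter as the spread of frame-to-frame
# intervals around their mean period. All timestamps are invented.
import statistics

def frame_jitter_ms(timestamps_ms):
    """Population standard deviation of frame intervals, in ms."""
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return statistics.pstdev(intervals)

# A perfectly periodic ~60 Hz trace has (almost) zero jitter ...
steady = [i * 16.67 for i in range(10)]
# ... while delaying the second half of the trace by 50 ms shows up at once.
jittery = steady[:5] + [t + 50.0 for t in steady[5:]]

print(frame_jitter_ms(steady))   # ~0
print(frame_jitter_ms(jittery))  # clearly non-zero
```

A real measurement would log one timestamp per rendered frame; the metric itself stays the same.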
Well, think about a game scene: why should it even be possible to record jitter? timedemo plays the same content, just at the maximum possible speed; when you disable vsync in the video driver (the NVIDIA 300+ drivers have vsync active by default) you get rates higher than 60 fps. But when it is played back at normal speed, he says it has jitter, don't make me laugh. Even if the movement was recorded at less than 60 Hz (something like that is in the demo), it can only look a bit jumpy, never jittery, because the same content cannot be played at less than 60 fps on a system that could otherwise reach that speed. He is just weird and sees jitter everywhere where there is none. It is definitely NOT a question of a frame taking too long to render.
If you look at the graphs from page 1, you see that there are frames which take more than 100 ms to render. This is unwanted behaviour and corresponds to a lowly 10 FPS or less (!). All standard benchmark software masks this behaviour because the frames-per-second metric averages out those long frames. I suppose we agree that looking at individual frame times adds a lot of useful information about how smooth a game runs.
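As a hedged illustration of that masking effect (the numbers below are made up, not taken from the page 1 graphs): one 100 ms frame among fifty-nine smooth ones barely dents the average, while the per-frame view exposes the 10 FPS stall.

```python
# Sketch: average FPS hides a single slow frame.
# 59 smooth ~60 Hz frames plus one 100 ms stall (invented numbers).
frame_times_ms = [16.67] * 59 + [100.0]

total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s     # still looks healthy
worst_fps = 1000.0 / max(frame_times_ms)    # the stall itself: 10 FPS

print(f"average:     {avg_fps:.1f} FPS")
print(f"worst frame: {worst_fps:.1f} FPS")
```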
What would you call this new type of analysis, then?
Basically it is nothing "new": lots of benchmark tests (on Windows) show the fps drawn on a time graph. Also, you should not compare your system against his, because the graphics card is definitely different. When the card is fast (and only limited by the driver), then you have to take other aspects, like asset load time, into account. Don't you remember his remarks that load time could be tuned with a special kernel config? All of that is more or less complete bullshit. Usually a second run does not show these peaks, once everything is pre-cached. It's also the same person who would buy a 2.x GHz quad core for dual-socket 2011 instead of a 3.5 GHz quad with turbo for gaming, just to fix jitter. Do you call that normal?
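The pre-caching claim can be checked mechanically. A sketch, with invented traces: flag frames far above the run's median as load spikes, and see whether they vanish on a warm second run of the same timedemo.

```python
# Sketch: flag asset-load spikes as frames far above the median.
# Both traces are invented example data.
import statistics

def load_spikes(frame_times_ms, factor=3.0):
    """Frame times more than `factor` times the median of the run."""
    median = statistics.median(frame_times_ms)
    return [t for t in frame_times_ms if t > factor * median]

first_run  = [16.7] * 20 + [120.0, 16.7, 95.0]  # cold caches: two stalls
second_run = [16.7] * 23                        # pre-cached: no peaks

print(load_spikes(first_run))   # [120.0, 95.0]
print(load_spikes(second_run))  # []
```

If the spikes persist on the second run, caching is not the explanation and the stalls live somewhere else (driver, scheduler, the game itself).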
I do not see why this discussion should be about Paradox Uncreated. This thread is about measuring hiccups/judder/jitter (or whatever you'd call it) in Doom 3. My point is that this could be useful. I posted the comparison of my system with Paradox's to demonstrate the possibilities of this method, no more, no less.
I would rather have a system which can always churn out a minimum of 30 fps than a system which averages 60 fps but with hiccups of 500 ms or longer.
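That preference is easy to express as a metric. A sketch with two invented traces: system B wins on average FPS, but its single 500 ms hiccup drags its worst case down to 2 FPS, while system A never drops below 30.

```python
# Sketch: average FPS vs worst-case FPS for two invented systems.
system_a = [33.3] * 60             # steady 30 FPS, no hiccups
system_b = [12.0] * 59 + [500.0]   # fast on average, one 500 ms stall

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_case_fps(frame_times_ms):
    return 1000.0 / max(frame_times_ms)

print(avg_fps(system_a), worst_case_fps(system_a))  # ~30 / ~30
print(avg_fps(system_b), worst_case_fps(system_b))  # higher avg / 2.0
```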
Guano hopes to be prime bitch for the next ten years too. Out of respect for the thread starter I am just going to ignore the subcattle and their worthless bloodstream.
I do think he makes a very good point about who he is, though. Apparently not even the extreme jitter in the demo is visible to him. He must obviously be blind. And seeing his argumentation in the low-jitter kernel thread, he lacks not only a mind but senses as well.
I have already posted the numbers I get as well, and they show low jitter.
Of course it fits logically. The only enemy of sense must be senselessness itself.
And he is probably frustrated from not receiving his daily cottaging, and rages in threads on Phoronix, and probably OSNews.
THE TROLL. Olaf in his barn. He doesn't understand computers, and he doesn't understand Islam. Just don't let him fool you, because then you will live at the lowest level, like him.
It depends on how you measure your 60 fps. If you measure it with vsync enabled in the graphics driver, it is unlikely that you will notice many drops while you play the game (ok, except when it loads new game data, which is faster from an SSD, btw). If you get 60 fps on average while neither the graphics driver nor the game settings restrict the rendering, then it is more likely that there are many parts below 60 fps (that's the case with my GT 630 @ 1920x1200). The switches in framerate are visible, that's clear, but then you usually just need a faster graphics card to get rid of them. D3 is from 2004, so all CPUs should be fast enough. As the code is open source, you can limit the Hz to 30 or 120 or whatever you like in neo/framework/UsercmdGen.h.
This is exactly what was said in the other thread too. There are no "loads new game data" frame losses here, and that is even without an SSD. And as I said in the other thread, not even quad SLI will improve jitter; actually it will worsen it.