That number is there to show that I am not interested in sacrificing performance below 0.2 ms. I know some people would want to go lower and talk about 5 µs latency. That is not what this is about.
At 0.2 ms things are virtually hardware. Things feel as responsive as a vintage Amiga 500, and then you have that whole crowd. Those low jitter levels inspired a generation of multimedia artists we really don't see today. Whenever someone nostalgically mentions the Amiga, it is mostly because it was a low-jitter machine: uninterrupted animations, instant sound when a key is pressed. That raises the overall computing experience to the point where quality software is more likely to happen, and I do believe that is why it happened on the Amiga.
Of course, arcade machines without jitter might be a better comparison today, for instance Virtua Fighter etc. It's been a long time since I played arcades, but Doom 3 feels more like a hardware arcade machine than any dull high-level PC. Which also means that high-level programming has reached the point of being virtually indistinguishable from assembly-coded hardware, which I also find to be an interesting point.
It ran completely smoothly the whole time, with samples etc., on the C64's 1 MHz CPU.
The Amiga had a 7 MHz CPU and a more customizable palette, usually 32 colors. But games like Psygnosis' Obliterator would be really slow and run choppy. It was still progress in technology, but I thought the C64 game was cooler. I think it was at that time that I started thinking about suboptimal high-level constructs and jitter.
Of course, many later learned how to push the Amiga and made it perform as smoothly as a C64, including Psygnosis. Even Tetris games (Twintris) were fun because of it.
Now most of us have machines above 5 GHz, yet they have problems performing as well. Yes, we have good high-level constructs such as OpenGL now, and obviously I have made it run very well. But how can anyone accept frame jitter and a computing experience worse than a C64 at 1 MHz?
That is really what inspires my low-jitter configs, which by the way are quite good these days.
I would really like to know which benchmarks you run to compare kernels. I just played a few levels using dhewm3 and a card below midrange (GT 630, Kepler) and still got 60 FPS with a kernel using the default u config. I did not notice input lag or anything. As Doom 3 is very rarely played online (4 players max with the default game, 8 with the addon), maybe you could say something about Quake Live, but there the graphics card does not need to be so fast, and I doubt the kernel config is so extremely important. Right now the default u config means a 250 Hz setting; that's four times a TFT's refresh rate. What changes do you make to your kernel?
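To make the question concrete: the kernel knobs usually discussed in this context are the timer frequency and the preemption model. A low-jitter build typically flips these away from the 250 Hz default mentioned above. This is only a sketch of common tweaks, not a claim about what the original poster's config actually contains:

```
# .config fragment: timer tick at 1000 Hz instead of the 250 Hz default
CONFIG_HZ_1000=y
CONFIG_HZ=1000
# Fully preemptible (low-latency desktop) kernel
CONFIG_PREEMPT=y
# CONFIG_PREEMPT_VOLUNTARY is not set
```

A 1000 Hz tick bounds timer-driven scheduling latency at about 1 ms instead of 4 ms, and full preemption lets high-priority tasks interrupt kernel code sooner; both trade a little throughput for lower jitter.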
1) You have 30 FPS in game X, but from time to time you see lag spikes (usually when lots of things happen, i.e. the most important parts of gameplay).
2) You have 25 FPS in game X, all smooth: from any given frame to the next, the same amount of time elapses.
Which one do you choose?
Well, that depends on the type of game. Solitaire... But generally, games that require immediate reaction to sound and video benefit from scenario 2).
That is why good (hardware) benchmarking sites have moved away from reporting plain FPS and present mathematical derivations such as medians, averages, or min/max frame times.
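A minimal sketch of why those derived statistics matter, using the two scenarios above. The frame-time lists and the helper name are made up for illustration; the point is that the stuttery run wins on average FPS but loses badly on worst-case frame time:

```python
import statistics

def frame_stats(frame_times_ms):
    """Summarize per-frame render times (in ms) the way benchmark sites do,
    instead of collapsing everything into a single average-FPS number."""
    avg_ms = statistics.mean(frame_times_ms)
    return {
        "avg_fps": 1000.0 / avg_ms,
        "median_ms": statistics.median(frame_times_ms),
        "min_ms": min(frame_times_ms),
        "max_ms": max(frame_times_ms),
    }

# Scenario 1: mostly fast 30 ms frames, but one 200 ms hitch mid-action.
stuttery = [30.0] * 59 + [200.0]
# Scenario 2: steady 40 ms frames, lower average FPS but perfectly even pacing.
steady = [40.0] * 60

print(frame_stats(stuttery))  # higher avg_fps, but max_ms shows the hitch
print(frame_stats(steady))    # lower avg_fps, max_ms equals every other frame
```

The stuttery run averages roughly 30 FPS against the steady run's 25 FPS, yet its 200 ms worst frame is exactly the lag spike described in scenario 1), which an FPS average alone would hide.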
But the benchmark does include a standard error, and looking at the full results, it shows that the low-jitter kernel usually reduces it, though the difference is very slight. On Prey it seems to make the largest difference, smoothing by 0.06 FPS while eating 0.36 FPS. On the other hand, with Xonotic at Ultra quality it makes everything worse, both the frame rate and the frame-rate smoothness.