That is a number to show that I am not interested in sacrificing performance to go below 0.2 ms. I know some people would want to go lower and talk about 5 µs latency. That is not what this is about.
Originally Posted by GreatEmerald
At 0.2 ms things are pretty much virtually hardware. Things feel as responsive as a vintage Amiga 500, and then you have that whole crowd. And those low jitter levels inspired a generation of multimedia artists we really don't see today. Whenever someone nostalgically mentions the Amiga, it is mostly because it is a low-jitter machine: uninterrupted animations, instant sound when a key is pressed. That pretty much raises the overall computing experience to where quality software is more likely to happen, and I do believe that is why it happened on the Amiga.
Of course arcade machines without jitter might be a better comparison today, for instance Virtua Fighter etc. Well, it's been a long time since I played arcades, but Doom 3 feels more like a hardware arcade than any dull high-level PC. Which also means that high-level programming has reached the point of being virtually similar to assembly-coded hardware, which I also find to be an interesting point.
Peace Be With You.
PS: I do like C and portability, but you know, some people are always going to make wild high-level constructs and emulate the universe before sending text to screen.
Two situations here:
1) You have 30 FPS in game X, but from time to time you see lags (usually when lots of things happen == the most important parts of gameplay).
2) You have 25 FPS in game X, all smooth, that is, the same amount of time elapses from any given frame to the next.
Which one do you choose?
Well, that depends on the type of game. Solitaire... But generally, games that require immediate action based on sound and video benefit from scenario 2).
That is why good (hardware) benchmarking sites are moving away from reporting plain FPS and instead present some mathematical derivations, like medians, averages, or min/max.
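The kind of statistics those sites report can be sketched in a few lines. This is a hypothetical example, with two invented frame-time series mirroring the scenarios above: a run that averages 30 FPS but stalls occasionally, versus a perfectly steady 25 FPS run.

```python
# Hedged sketch: compare two invented frame-time series using the
# statistics (median, average, min/max, jitter) that benchmark sites
# report instead of a bare FPS number.
from statistics import mean, median, pstdev

spiky  = [30.0] * 28 + [80.0] * 2  # frame times in ms: two 80 ms stalls
steady = [40.0] * 30               # perfectly even 40 ms frames (25 FPS)

def report(name, frame_times_ms):
    avg_fps = 1000.0 / mean(frame_times_ms)
    min_fps = 1000.0 / max(frame_times_ms)  # worst single frame
    print(f"{name}: avg {avg_fps:.1f} FPS, "
          f"median {median(frame_times_ms):.1f} ms, "
          f"min {min_fps:.1f} FPS, "
          f"jitter (stddev) {pstdev(frame_times_ms):.1f} ms")

report("scenario 1 (spiky)", spiky)   # avg 30.0 FPS, but min only 12.5 FPS
report("scenario 2 (steady)", steady) # avg 25.0 FPS, min 25.0 FPS, 0 ms jitter
```

Even though scenario 1 averages the higher FPS, its worst frame dips to 12.5 FPS, which is exactly the lag you feel; the steady run never dips below 25.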
We aren't even talking about 5 fps here, and some benchmarks even show higher.
I can tell you a little background for this, for those who are amused by such stories.
I remember getting an Amiga 500 at the time it was becoming popular. By then I had already used a C64 for quite some time (since I was a child of 6-7 years).
I remember playing this on the C64.
It was running completely smooth all the time, with samples, etc.
With the C64's 1 MHz CPU.
The Amiga had a 7 MHz CPU and a more customizable palette, usually 32 colors. But games like Psygnosis' Obliterator would be really slow and run choppy. It was still a step forward in technology, but I thought the C64 game was cooler. I think at that time I started thinking about suboptimal high-level constructs, and jitter.
Of course, many later learned how to push the Amiga and made it perform as smooth as a C64, including Psygnosis. Even Tetris games (Twintris) were fun because of it.
Now most of us have machines above 5 GHz, yet they have problems performing as well. Yes, we have good high-level constructs like OpenGL now, and obviously I have made it run very well. But how can anyone accept frame jitter and a computing experience worse than a C64 at 1 MHz? ;)
That is really what inspires my low-jitter configs, which btw are quite good these days.
Peace Be With You.
PS: Games were even made in BASIC on the C64 that had no slowdowns. Imagine, interpreted BASIC on a 1 MHz CPU can do lower jitter than a modern PC (well, on many PCs ;)).
On the PC, you might notice a similar point when running a game in MS-DOS vs. on Windows, where it would perform worse.
Peace Be With You.
On the C64 I remember you had to do clever stuff with interrupts to do smooth scrolling. What has happened since then? Someone please update me!
Software blitting due to insanely high CPU speeds and video memory throughput :-P Started around the time of the 486. Ended again when 3D accelerators came along.
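For anyone who hasn't seen one, a software blit is just the CPU copying a rectangle of pixels from one buffer into another, row by row. A minimal sketch (the flat row-major buffer layout and the `blit` helper are my own illustration; real blitters of that era worked on video memory in C or assembly):

```python
# Sketch of a software blit: copy a w×h rectangle of pixels from a
# sprite buffer into a framebuffer, one row at a time, entirely on
# the CPU. Buffers are flat row-major lists of pixel values.

def blit(src, src_w, dst, dst_w, x, y, w, h):
    """Copy a w×h block from src into dst at position (x, y)."""
    for row in range(h):
        s = row * src_w            # start of this row in the sprite
        d = (y + row) * dst_w + x  # destination offset in the framebuffer
        dst[d:d + w] = src[s:s + w]

# 16×16 "screen" and a 4×4 all-white sprite
screen = [0] * (16 * 16)
sprite = [0xFFFFFFFF] * (4 * 4)
blit(sprite, 4, screen, 16, 2, 3, 4, 4)
print(hex(screen[3 * 16 + 2]))  # top-left pixel of the blitted sprite
```

With fast CPUs and fast video memory, loops like this could repaint whole screens per frame, which is why dedicated blitter hardware stopped mattering until 3D accelerators changed the game again.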
Originally Posted by Paradox Uncreated
I would really like to know which benchmarks you run to compare kernels. I just played a few levels using dhewm3 and a card below midrange (GT 630, Kepler) and still got 60 FPS with a kernel with the default u config. I did not notice input lag or anything. As Doom 3 is very rarely played online (4 players max with the default game, 8 with the addon), maybe you could say something about Quake Live, but there the gfx card does not need to be so fast, and I doubt the kernel config is so extremely important. Right now the default u config means a 250 Hz setting; that's 4 times faster than a TFT refresh rate. What changes do you make to your kernel?
But the benchmark does include a standard error. And, looking at the full results, it does show that the low-jitter kernel usually reduces it, but the difference is very slight. On Prey it seems to make the largest difference, 0.06 FPS of smoothing gained while eating 0.36 FPS. On the other hand, with Ultra quality Xonotic, it makes everything worse, both the framerate and the smoothness of the framerate.
Originally Posted by przemoli