Quote Originally Posted by Pyre Vulpimorph:
The point of this thread, really, was so I can learn more about how modern games work. Specifically, what the CPU is left doing while the GPU is busy rendering frames. So, let's shoot for the moon and say my client's system will include a Radeon HD 7870 (Pitcairn) GPU, and "normal" output resolution is going to be 1920x1080.
I think the best thing you can do to learn how a modern game works is to make one. Failing that (time pressure is a good reason why it isn't feasible), go talk to game developers. Cover the full range: indie and AAA, 2D and 3D, browser-based, desktop, console, and mobile, the works. Then dig into the details:

- Single-threaded vs. multi-threaded designs, and how engines differ there (see the threading sketch at the end of this post).
- What OpenGL does across the different platforms.
- Actual engines, profiled yourself if you can't get numbers from elsewhere (Ogre, Irrlicht, the open-sourced id Software engines) -- there's a timing sketch just below this list.
- Power requirements, heat dissipation, active vs. passive cooling, and so on.
- Peripherals: how many controllers are attached, and the like.
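To make the "what is the CPU doing while the GPU renders" question concrete, here's a minimal, hypothetical sketch of how you might instrument a game loop to see where the CPU's frame budget goes. The three work functions are stand-ins simulated with sleeps, not any real engine's API; in a real engine you'd time the actual update, draw-submit, and present/swap calls instead.

```cpp
// Hypothetical sketch: where does a game loop spend its CPU time each frame?
#include <chrono>
#include <cstdio>
#include <thread>

using Clock = std::chrono::steady_clock;

static double ms_since(Clock::time_point start) {
    return std::chrono::duration<double, std::milli>(Clock::now() - start).count();
}

// Stand-ins for real engine work; a real build would call into the
// engine's update/render/present paths instead of sleeping.
static void simulate_game_logic()  { std::this_thread::sleep_for(std::chrono::milliseconds(4)); }
static void submit_draw_calls()    { std::this_thread::sleep_for(std::chrono::milliseconds(2)); }
static void wait_for_gpu_present() { std::this_thread::sleep_for(std::chrono::milliseconds(10)); }

int main() {
    for (int frame = 0; frame < 5; ++frame) {
        auto t0 = Clock::now();
        simulate_game_logic();          // AI, physics, animation -- pure CPU work
        double logic_ms = ms_since(t0);

        auto t1 = Clock::now();
        submit_draw_calls();            // building command buffers / driver calls
        double submit_ms = ms_since(t1);

        auto t2 = Clock::now();
        wait_for_gpu_present();         // blocked on vsync / the GPU finishing the frame
        double wait_ms = ms_since(t2);

        std::printf("frame %d: logic %.1f ms, submit %.1f ms, gpu wait %.1f ms\n",
                    frame, logic_ms, submit_ms, wait_ms);
    }
}
```

Run against a real loop, a large "gpu wait" share means the game is GPU-bound at that resolution -- which is exactly the Pitcairn-at-1080p question quoted above -- while a large "logic" share means the CPU is the bottleneck.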
Also, ask game developers what they currently have, what problems they have to work around, and what they would like to have. Don't stop at hardware features such as "virtualised textures", "cache control for data streaming", or "z-buffer access"; ask about the languages they program in, whether there's an OS to work with (or around), and the support, compilers, and toolkits available. It's not just about the hardware; software support matters too.
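On the single- vs. multi-threaded point above: one common engine pattern is to decouple game logic from rendering with a producer/consumer handoff between two threads. This is a generic, hypothetical sketch of that pattern -- the FrameData struct and the single-slot handoff are assumptions for illustration, not code from any particular engine.

```cpp
// Hypothetical sketch of the update-thread / render-thread split many engines use.
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <optional>
#include <thread>

struct FrameData { int frame_number; };  // stand-in for the scene state to draw

std::mutex mu;
std::condition_variable cv;
std::optional<FrameData> pending;   // single-slot handoff between the two threads
bool done = false;

void render_thread() {
    for (;;) {
        std::unique_lock<std::mutex> lock(mu);
        cv.wait(lock, [] { return pending.has_value() || done; });
        if (done && !pending) return;
        FrameData frame = *pending;
        pending.reset();
        cv.notify_one();            // slot is free: the update thread may produce the next frame
        lock.unlock();
        std::printf("render: drawing frame %d\n", frame.frame_number);
    }
}

int main() {
    std::thread renderer(render_thread);
    for (int i = 0; i < 5; ++i) {
        // Game logic for frame i would run here, in parallel with the renderer
        // drawing frame i - 1 on the other thread.
        std::unique_lock<std::mutex> lock(mu);
        cv.wait(lock, [] { return !pending.has_value(); });  // wait for a free slot
        pending = FrameData{i};
        cv.notify_one();
    }
    {
        std::lock_guard<std::mutex> lock(mu);
        done = true;
    }
    cv.notify_one();
    renderer.join();
    return 0;
}
```

The point of the pattern is that the CPU can start simulating frame N+1 while the GPU (driven by the render thread) is still drawing frame N; engines differ mainly in how deep that pipeline goes and how the handoff is structured.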