Quote Originally Posted by chuckula
To all of the people spreading FUD that this card is less powerful than a 5-year-old desktop ATI part... GET THE FACTS. People here are comparing hand-wavy theoretical peak numbers for *single precision operations*, which are never reached in real life, against certified *real* performance numbers from actual benchmarks run on MIC systems. There is a *WORLD* of difference, and if you don't believe me, go read the TOP500 list and see how many 5-year-old AMD GPUs are in there if those parts are supposedly so amazing... I can save you some time: the answer is 0.
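(If you want to see how far apart those two kinds of numbers can be, here is a rough back-of-the-envelope sketch. Every figure in it is a made-up placeholder, not a spec or a measurement for any real card:)

[CODE]
#include <stdio.h>

/* Rough back-of-the-envelope sketch of theoretical peak vs. sustained
 * throughput. Every number below is a made-up placeholder, NOT a spec
 * or a benchmark result for any real card. */
int main(void)
{
    double cores         = 64.0;  /* hypothetical core count               */
    double ghz           = 1.0;   /* hypothetical clock, GHz               */
    double flops_per_clk = 32.0;  /* hypothetical SP FLOPs/cycle per core  */
    double sustained     = 1.0;   /* hypothetical benchmark result, TFLOPS */

    double peak = cores * ghz * flops_per_clk / 1000.0; /* TFLOPS */
    printf("marketing peak : %.2f TFLOPS\n", peak);
    printf("sustained      : %.2f TFLOPS (%.0f%% of peak)\n",
           sustained, 100.0 * sustained / peak);
    return 0;
}
[/CODE]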

Let me put it to you this way: the exact same benchmarks where Intel has already shown MIC running at over 1 teraflop are the benchmarks where Nvidia is *projecting* its *full GK110 part* will land when it is finally released. Basically, Nvidia's top-of-the-line, next-generation, 7+ billion-transistor monster will be in the same league as MIC, but will require you to use the CUDA programming model to get that performance. MIC totally destroys any existing compute accelerator on the market, and as a huge bonus the programming model for MIC is light years ahead of having to use CUDA or whatever passes for OpenCL these days in AMD land. MIC is a fully documented architecture whose SIMD instructions are an extension of the AVX instructions already used in Intel & AMD CPUs.
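(To make the programming-model point concrete, here is a minimal sketch: a plain C loop with a standard OpenMP pragma that any x86 compiler can thread and auto-vectorize with AVX-style SIMD. The function and sizes are purely illustrative and this is not Intel's actual offload API, just ordinary portable C:)

[CODE]
#include <stddef.h>

/* Minimal illustration of the "ordinary x86 code" programming model:
 * a plain C SAXPY loop with a standard OpenMP pragma. An x86 compiler
 * can thread it and auto-vectorize it with AVX-style SIMD; no separate
 * kernel language is needed. Purely illustrative, not Intel's offload API. */
void saxpy(size_t n, float a, const float *x, float *y)
{
    #pragma omp parallel for
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}
[/CODE]

The CUDA equivalent of that loop has to be written as a separate kernel in CUDA's own dialect and launched with its own grid/block syntax, which is exactly the lock-in I am talking about.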

MIC is a *vastly* more open architecture than anything from Nvidia or AMD. And don't even get me started on the likes of Quaridiot, who act like the Messiah has returned when AMD releases incomplete and inaccurate docs for some of its cards, long after they have shipped, for a couple of unpaid volunteers to decipher. MIC is a 100% openly documented architecture, and Intel has already released open source software for it in advance of its launch. This architecture will hopefully force Nvidia and AMD to *really* take Linux seriously instead of treating it like a second-class citizen while trying to make huge $$$ on Linux-based HPC systems.

So if the "facts" come from Intel they're the truth, but if they come from anyone else they're lies? Get your bullshit straight. Intel is competent at precisely one thing: faking benchmarks.

So if you want to get your FACTS, buy one and run some *REAL* code on it. See it fall on its face.