Oh, and let's also mention this: sure, Nvidia is indeed the performance king, but let's not forget the usual complaint that Nvidia's cards are massive compared to ATI's and use far more power.
I can't find the original article, but here is how it was quoted:
"Nvidia's GeForce GTX 295 with SLI-on-a-card is the most powerful single graphics card on the planet. With two attached GeForce GTX 275 cards that have been merged, the GeForce GTX 295 offers very notable gains over the Radeon HD 4870 X2 in the great majority of game titles. Even more impressive is that it does so while consuming less power than ATI's flagship card, which is no small feat."
On top of that, if you are going top of the line, power and heat are a non-issue anyway. It is like buying a Ferrari: do I care that it gets 5 miles per gallon? No, I care that it looks awesome and goes screaming fast. If I cared about gas mileage, I would buy a Prius.
And again, ATI fans, I am not saying they are a bad company. I own 3 ATI machines myself, and an AMD machine on top of that. However, if you look beyond gaming, Nvidia is putting a lot of focus on GPGPU. Also, many of you cite DX 10.1 or DX 11. These are just silly API standards Microsoft puts out: Microsoft decides which features should be in its next graphics API. In general, there is very little, if any, innovation here, because by the time these standards are adopted, the features are already present in current graphics hardware.
Let me explain:
1. Fog is cool, so graphics companies introduce special fog hardware
2. Microsoft decides it is a good idea and introduces a standard in the API
3. The hardware simply responds to that API call/command (like a function call)
4. Bam, you now support the fog function
As long as you satisfy all the functions, you are golden. Very little... innovation.
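To make step 3 concrete, here is a minimal sketch of what "responding to the API call" looks like from the application side, using Direct3D 9's fixed-function fog render states. The `device` pointer is assumed to be an already-initialized `IDirect3DDevice9*`, and the helper name `enable_linear_fog` is mine, not something from the SDK:

```cpp
#include <d3d9.h>

// Hypothetical helper (name is mine). All the application does is flip a few
// render states; the driver and the GPU's fog hardware do the actual work.
void enable_linear_fog(IDirect3DDevice9* device, float fog_start, float fog_end)
{
    device->SetRenderState(D3DRS_FOGENABLE, TRUE);              // "bam", fog is on
    device->SetRenderState(D3DRS_FOGCOLOR, D3DCOLOR_XRGB(128, 128, 128));
    device->SetRenderState(D3DRS_FOGTABLEMODE, D3DFOG_LINEAR);  // pick the fog equation
    // Direct3D 9 passes float-valued states as their raw bit pattern in a DWORD.
    device->SetRenderState(D3DRS_FOGSTART, *reinterpret_cast<DWORD*>(&fog_start));
    device->SetRenderState(D3DRS_FOGEND,   *reinterpret_cast<DWORD*>(&fog_end));
}
```

The application never touches the fog hardware itself; it just checks the boxes the API defines, which is exactly why satisfying the API spec involves so little invention.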
Now take a second and read what CUDA has done, and I think you will have a better appreciation of Nvidia. I have had MANY first-hand experiences. One of my classmates was an intern for the State of California's water management agency, and he accelerated their groundwater model from taking hours to taking seconds. There is plenty more published CUDA material on the Nvidia site.
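For anyone who hasn't seen why CUDA gives that kind of speedup, here is a minimal sketch (my own toy example, not my classmate's actual code): a one-dimensional diffusion step, the sort of cell-by-cell update a groundwater model spends most of its time on. On a CPU this is a serial loop over every cell; in CUDA, each cell gets its own thread. The names (`step_kernel`, `alpha`) are illustrative:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy 1-D diffusion step: each thread updates one cell from its neighbors,
// so thousands of cells advance in parallel instead of one at a time.
__global__ void step_kernel(const float* in, float* out, int n, float alpha)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i > 0 && i < n - 1)
        out[i] = in[i] + alpha * (in[i - 1] - 2.0f * in[i] + in[i + 1]);
}

int main()
{
    const int n = 1 << 20;                      // about a million cells
    float *d_in, *d_out;
    cudaMalloc(&d_in,  n * sizeof(float));
    cudaMalloc(&d_out, n * sizeof(float));
    cudaMemset(d_in, 0, n * sizeof(float));     // dummy initial state

    int threads = 256;
    int blocks  = (n + threads - 1) / threads;  // enough blocks to cover all cells
    step_kernel<<<blocks, threads>>>(d_in, d_out, n, 0.1f);
    cudaDeviceSynchronize();

    cudaFree(d_in);
    cudaFree(d_out);
    printf("one diffusion step done on %d cells\n", n);
    return 0;
}
```

The hours-to-seconds story is what happens when an inner loop like this gets mapped onto tens of thousands of GPU threads instead of a single CPU core.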
Again, I by no means hate ATI or AMD. I simply think they are not doing anything spectacular.