What does TDP mean, Nvidia?

The use of the term TDP will inevitably lead to comparisons between the power consumption of Nvidia’s graphics cards and the heat output of CPUs and GPUs. By defining TDP as an input – and, in fact, as the maximum power draw of the entire card, rather than just the GPU – Nvidia is making its product look comparatively terrible. A 6-core LGA1366 Core i7-980X Extreme Edition has a TDP of 140W, and a Phenom II X6 1090T Black Edition one of 125W, yet with a ‘TDP’ of 255W Nvidia makes the GTX 580 1.5GB seem roughly twice as hot, or twice as power-hungry, as those CPUs.

And for what? Detractors and the flippant will say that all this power is merely being used to play some games that can run just as well on an Xbox – what a waste of valuable resources!

OpenGL (2931d ago)

The maximum power usage of the entire card should be listed, as that's what will actually matter to the user.

OpenGL (2930d ago)

One mistake the author makes is comparing the Core i7-980X's power usage to that of a GeForce GTX 580 and claiming the GTX 580 is less efficient. The GTX 580 is a 3-billion-transistor chip, compared to the 1.17 billion transistors in the Core i7-980X. Besides, comparing a CPU to a GPU doesn't make a lot of sense in the first place.

kornbeaner (2930d ago)

He doesn't compare them directly when it comes to TDP; he merely uses that as an example of how the non-tech-savvy would read a number such as TDP. The article was written in a sort of defensive stance toward Nvidia's TDP number. But in actuality the number is still pretty high, considering the new AMD 6000-series cards seem to have reduced their TDP, and this card seems to suck more power than even the 5970, which is not good at all.