dgingeri :
Really, it is unprecedented. My performance has always gone up and up; there really has been no way around that. The difference now is that the top of the line, and in my case a couple steps down from top of the line, has increased in performance while decreasing power usage. I typically stick with three steps down from cutting edge on processors and one step down on video cards, at least since they became a deciding factor. (Back in the early days, there really wasn't much difference in video card performance. The processor meant everything.) This time, I actually got the top of the line video card, for the first time ever.
My Core i7 920 was faster than my older Core 2 E8400, but it used more power. The Athlon 64 X2 5600+ used less than the E8400, the Athlon 64 3700+ before that used less power than the 5600+, and so on, all the way down to my 486SX 25MHz. The processor has always used more power as the performance increased.
With my video cards, it has been much the same. The two GTX470s used more power than the 4870X2. The 4870X2 used more power than the pair of 7800GTs I had before that. The 7800GTs, even one of them, used more power than the 6800GTS I had before that. All the way down to my old Cirrus Logic 512k that came with my original 486SX system.
Hard drives are the one part that has stayed pretty steady. They keep going up in performance and capacity, while power usage has actually gone down a little.
I guess at the moment I'm a bit off from my old habits, in that my processor is four steps down from cutting edge and my video card is top of the line, but overall I'm pretty close to normal for me. Yet my performance increased significantly, more so than usual, and my power usage went down for the first time ever.
I could give you a whole run down on CPUs and video cards all the way back to my 486SX machine if you want. Hard drives would be a bit more difficult.
No, it's not unprecedented. Comparing an i7-920 to an E8400 is comparing a quad core to a dual core; there's no way it wouldn't use more power when they're only one generation apart. Even the first Core 2 Quads didn't use less power than many of the Pentium Ds (dual core Netbursts), and that's one of the greatest generation-to-generation improvements among fairly recent CPUs.
You're comparing apples to oranges to prove your point, and that simply doesn't work unless you fix the comparison, which you didn't. Although the two GTX 470s used more power than a single 4870X2, they deliver so much more performance that they still had higher performance per watt (they were more energy efficient). A GTX 580 uses more power than a GTX 470, but it's more energy efficient nonetheless, because the performance difference between the two (favoring the 580) is greater than the power usage difference.
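To make the performance-per-watt point concrete, here's a minimal sketch. The performance numbers are hypothetical placeholders and the power figures are rough assumptions for illustration, not measured values:

```python
# Sketch of the performance-per-watt comparison described above.
# Performance units are hypothetical; power draws are rough
# assumptions (~286 W for the 4870X2, 2 x ~215 W for 470 SLI).

def perf_per_watt(performance: float, watts: float) -> float:
    """Relative performance divided by power draw."""
    return performance / watts

# name: (relative performance units, approximate power draw in watts)
setups = {
    "4870X2":      (100, 286),  # baseline performance (assumed)
    "GTX 470 SLI": (160, 430),  # draws more total power (assumed)
}

for name, (perf, watts) in setups.items():
    print(f"{name}: {perf_per_watt(perf, watts):.3f} perf/W")
```

With numbers like these, the SLI setup draws more watts overall, yet its perf/W ratio comes out higher, which is the sense in which it is "more efficient."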
Your power usage went down despite getting the top of the line card now because that card uses just under 200 W. Comparing it to the 4870X2 and dual GTX 470s is apples to oranges because it is a single GPU up against previous-generation dual-GPU setups. You would get the same result going a further generation back and comparing the 3870X2 to the GTX 560 Ti instead of the 4870X2 to the GTX 680. The 680 probably won't even remain the top of the line Nvidia card unless AMD allows it. If AMD releases a better card (they have one in the making; now we need to find out if it gets made), then Nvidia will release a monstrous GTX 685 or 680 Ti that easily passes up the 680 and has an ~250 W TDP to equal past top single-GPU cards.
Here are some simple ways to show what I mean. The GTX 295 is roughly equivalent to the Radeon 4870X2; both are dual-GPU cards from about the same time on the same process node, so that's an apples to apples comparison. The GTX 480 is the next single-GPU flagship card from Nvidia after those two dual-GPU cards. It not only uses less power than them, but it gives about the same performance. A single generation later, we have the GTX 580, which uses less power than the 480 despite having a fairly large performance advantage over it. The 480 to the 580 is also an apples to apples comparison: not only do they use the same process node, they even use the same architecture; the 580 was simply a more optimized, bug-fixed version.
Here we have the 295 being beaten by the 580 in both power usage AND performance, with a two-generation delta between them. Now compare the GTX 470 SLI setup with the GTX 680. The 680 is, like the 580 in the 295 comparison, two generations ahead, uses less power, and is higher performing. I don't know about you, but I see a trend here. The 680 actually under-performs its assumed target based on this data, because it technically should have performed as well as the GTX 590 and it doesn't, for some easily explained reasons. I can go into that too if you ask me to.
Looking back on the AMD side, the next generation after the 4000s was the 5000s, and their top single-GPU card was just slightly behind the 4870X2 in performance but used a lot less power. It was close enough that a simple overclock would close the gap while still using a lot less power than the 4870X2. The next top single-GPU card from AMD, the 6970, does match the 4870X2 and still uses far less power. This is all fairly simple math and attention to detail. No, none of this is unprecedented unless you didn't keep up with video cards over the years.
Also, the 4870X2 and GTX 470 were obviously made for different markets (the 4870X2 is a higher end card despite being older), and a one-generation difference isn't enough to change that. It should come as no surprise that the 470 was more power efficient, since it had a newer process node and architecture, but the two cards are still aimed at different markets.
Quite frankly, if you could afford a 4870X2 back then, I'm surprised that you didn't replace it with a GTX 480 instead of a 470. The power usage would have dropped, but performance would not have.
Historically, power usage also increased on the higher end models of processors and video cards, but power efficiency increased too, so the lower end models increased in performance instead of power usage. However, that trend has reversed ever since Core 2 for Intel CPUs (AMD has stayed about the same in power usage with each generation for a while now: 95 W for low/mid end and 125 W for high end), and around the same time for video cards. Some video cards still use more and more power, but the vast majority are using less and less. The GTX 580 uses less than the 480, the Radeon 7970 uses less than the Radeon 6970, the GTX 680 uses less than the GTX 580, etc.
I said this in my last post and I'll say it again. Core 2's top TDP for its fastest quad-core CPUs: 150 W. Nehalem's top TDP for its fastest quad-core CPUs: 125 W. Sandy Bridge's: 95 W. Ivy Bridge's: 77 W. Obviously, power usage has been going down as performance increases over the last several years.
GTX 480: 250 W. GTX 580: 244 W. GTX 680: 195 W. It's happening in graphics cards too. Besides that, even as TDPs increased in the past, performance per watt increased more than power usage did, so power efficiency increased regardless.
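Laying the TDP figures quoted above out as data makes the trend explicit; this quick sketch uses only the numbers from this post, nothing measured independently:

```python
# Flagship TDPs quoted above, in watts, oldest to newest.
intel_quad_tdps = [("Core 2", 150), ("Nehalem", 125),
                   ("Sandy Bridge", 95), ("Ivy Bridge", 77)]
nvidia_flagship_tdps = [("GTX 480", 250), ("GTX 580", 244), ("GTX 680", 195)]

def strictly_decreasing(series):
    """True if each generation's TDP is lower than the previous one's."""
    watts = [w for _, w in series]
    return all(newer < older for older, newer in zip(watts, watts[1:]))

print(strictly_decreasing(intel_quad_tdps))      # True
print(strictly_decreasing(nvidia_flagship_tdps)) # True
```

Both series come out strictly decreasing, which is the "power going down while performance goes up" trend the post describes.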
All of your examples are apples to oranges comparisons, or are so outdated that they no longer reflect modern technological trends in the industry.