These tests posted above are just a few out of a slew of real-world, real-app, and synthetic tests. It doesn't mean anything, as this is just Nvidia's marketing and wishful thinking.
My first guess is that this is a defective GTX 470 that didn't pass quality control the first time out, much like the 5830s, which were defective 5870s with parts disabled. Marketing at its best: a lower-end variant that will eventually be harder to find later on. This is also like the GTX 275.
I'm just waiting for the DX11 + PhysX + 3D Vision Nvidia card with 8800 GT-level price/performance. I'm not a fanboy, but I can't understand the hurry to switch to DX11 by buying an ATI card or one of the first Fermi cards. 3D Vision is an Nvidia thing, PhysX is an Nvidia thing, and there are more titles using PhysX than DX11. I'd even say there are more titles using PhysX than DX10, because there are practically no games on the market fully coded for DX10/11, and you're all talking about spending money on eye-popping FPS with Vsync off (tearing). 3D monitors will always cost less than an Eyefinity setup, and besides, you just need to wait for the next Fermi generation: for playing today's PC games, and even next-gen games, at 720p/1080p/1200p and 30 FPS average, every high-end card from the last two generations can outperform console settings. Also, Nvidia CUDA is great. So if the question is price, the answer is: wait. The ATI 5000 and Nvidia Fermi series cards are terribly expensive because right now these cards are not required, just luxury products. Fermi is just the new 8800 GTX. Maybe things will change next month with Microsoft's plans for Visual Studio 2010 and updates to DirectX 11 DirectGPU...
Another pretty important thing coming soon will be Optimus/mobile Fermi-based laptops. Also, a lot of you don't give drivers the importance they deserve, and Nvidia is a lot better and faster at improving them, sometimes f_kin faster than ATI. Sometimes I just wonder whether some people like the games more, or if the real game is FRAPS.
M. (Spain - sorry about my English)
The benchmarks have also determined that the GTX 465 will actually require less than 9,000 watts and produce temperatures below 9,000 degrees, unlike the 480 and 470. It'll still be so power-hungry that it requires a nuclear reactor or a hydroelectric dam to power it, and it'll still be hot enough to melt the polar ice caps in 24 hours... but at least it'll be $100 cheaper, right?
@meat81: The single-GPU GTX 480 draws more power than a Radeon 5970 and reaches temperatures near 100 degrees Celsius. A dual-GPU version of the current GTX 480 would be impossible to make without lowering clocks, maybe trimming some parts, a big fat cooler, and of course Nvidia's trademark: a big fat price.
[citation][nom]alexmihai[/nom]@meat81: The single-GPU GTX 480 draws more power than a Radeon 5970 and reaches temperatures near 100 degrees Celsius. A dual-GPU version of the current GTX 480 would be impossible to make without lowering clocks, maybe trimming some parts, a big fat cooler, and of course Nvidia's trademark: a big fat price.[/citation]
And you know for a fact that Nvidia will not make a revised dual-GPU version? Nope, you do not. With ATI having the fastest single-card solution, I don't think Nvidia will let that go. It's just the nature of business to try to outdo your competition, aka a pissing contest.