I think you guys don't see the trees because the forest is in the way. What I see is a red graph that shows me UNPLAYABLE framerates HALF THE TIME (under 30, a bit above 20 fps) and a green graph that shows me good PLAYABLE framerates 100% of the time (above 30). What matters when playing a game is not the maximum or average framerate, it's the MINIMUM that makes it choppy and unplayable. I know it's an nVidia-conducted test, but you should really take a hard look at the actual result and consider 'what if'.
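To make that concrete, here's a minimal sketch (the frame times are made up for illustration, since the actual benchmark data isn't published) showing how a card can post a decent AVERAGE while the MINIMUM is still in choppy territory:

```python
# Hypothetical per-frame render times in milliseconds (made-up sample data).
frame_times_ms = [18.0, 22.5, 41.0, 35.5, 16.7, 48.2, 19.9]

fps_per_frame = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # time-weighted average
min_fps = min(fps_per_frame)
# A frame longer than 33.3 ms means that instant dropped below 30 fps.
below_30 = sum(t > 1000.0 / 30.0 for t in frame_times_ms) / len(frame_times_ms)

print(f"avg: {avg_fps:.1f} fps, min: {min_fps:.1f} fps, "
      f"{below_30:.0%} of frames under 30 fps")
```

With those numbers you get roughly 35 fps average but a ~21 fps minimum, with about 40% of frames under 30, exactly the kind of result the red graph describes.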
It looks like GTX480 has double the peak compute shader performance compared to HD5870. This reminds me of the 8000 series paper launch, when every ATi fanboy cried wolf and the wolf actually came and ate everything in its path.
Sure, availability of the GF100 will not be good at first, but ATi had 3 months of availability problems with its 5000 series and STILL does with some select models, prices have gone up, and if you look at sales numbers, most people ended up with an nVidia card last year.
And the fact that GTX480 is not as fast as the dual-GPU HD5970 is actually a good thing if you think really hard about it: a single GPU trading blows with a dual-GPU card says a lot about per-chip performance, and it leaves room for a dual-GPU GF100 later.