Well, it doesn't necessarily contradict itself, even on the minimum FPS argument you point out yourself.
Look at some of the HD series performance and you see almost a tunnel of performance in some games, with no great highs but also no great lows, while the cards it's being compared to have a higher average framerate and a much higher max fps, but also a deeper trough of min fps. It's a similar situation when people talk about SLi/Xfire giving little benefit, yet when you look at an actual histogram or per-second benchie of the two, suddenly there's a distinct difference that doesn't show up much in the avg fps test.
A perfect example of this was one of those early Crysis benchies with the HD3870 vs GF8800GT;
http://iax-tech.com/video/3870/38704.htm
The average is higher, but look at the difference in the minimum fps: even on the retest on the next page it's 4x higher.
Now that still doesn't show whether it's a very brief blip or a deep trough, but it starts to show how something can have a lower average FPS and yet still have smoother framerates. A consistent 50 fps is smoother, while being lower on average, than 20 fps - 150 fps - 30 fps - 90 fps - 15 fps - 120 fps - etc.
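To put rough numbers on that (just an illustrative sketch using the figures above, not data from the linked review):

```python
# Compare a steady framerate to a spikier trace with a higher average.
steady = [50] * 6
spiky = [20, 150, 30, 90, 15, 120]

for name, trace in [("steady", steady), ("spiky", spiky)]:
    avg = sum(trace) / len(trace)
    print(f"{name}: avg {avg:.0f} fps, min {min(trace)} fps, max {max(trace)} fps")

# steady: avg 50 fps, min 50 fps, max 50 fps
# spiky:  avg 71 fps, min 15 fps, max 150 fps
# The spiky trace "wins" on average fps but dips to 15 fps,
# which is the stutter the minimum-fps argument is about.
```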
And this has been seen in some situations with the HD2900 & GF8800 series, though it's usually a rarity versus the Ultra/GTX and not as rare versus the GTS.