Overall, what's the difference between 100 fps and 150 fps, and can a human actually tell them apart? I wonder if Tom's has done an article on the topic; I haven't seen one as of late. If it were a difference between 30 fps and 60 fps, then yes, it's obvious enough. But once you go beyond around 80, IMO it becomes harder and harder to tell who is running what fps at any given time, does it not?
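Just to put rough numbers on that (my own back-of-the-envelope sketch, not anything from a Tom's article): going from 100 to 150 fps only shaves about 3 ms off each frame, versus roughly 17 ms going from 30 to 60 fps, which is part of why the gap stops being obvious at the high end.

```python
# Rough frame-time math: why big fps jumps matter less as fps gets higher.
def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

pairs = [(30, 60), (60, 100), (100, 150)]
for low, high in pairs:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} fps: each frame arrives {saved:.1f} ms sooner")

# Output:
# 30 -> 60 fps: each frame arrives 16.7 ms sooner
# 60 -> 100 fps: each frame arrives 6.7 ms sooner
# 100 -> 150 fps: each frame arrives 3.3 ms sooner
```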
The whole argument I've seen here, as hijacked by Nvidia fanboys (usually the same Intel fanboys circling around to make AMD fanboys' lives miserable...), about the 670 being better than the 7950: IMO it varies from person to person in what is seen and unseen, as much as which company did a superb job improving on the reference card and which did a complete hack job. It's nearly an apples-to-oranges comparison in my view; Nvidia does things that ATI doesn't do and vice versa. Some cards perform better than others, but comparing cards of different types, like the 7950 and the 670, is akin to chasing a rabbit versus a hare. There is a difference; you may not see it immediately, but it's there. What really matters is what YOU see and whether you're satisfied with the product or not.