Guest
[citation][nom]Crashman[/nom]Who buys a $4k system to game at 1920? The 2560 results are there.[/citation]
I have to say it makes little sense to see 100+ fps in a benchmark, though it is a useful comparison against our older systems, which often can only run such games at 1024x768.
The sweet spot for benchmarks is anything that measures in the range of:
- 20-25 fps avg (unplayable, or barely playable for games with a fairly consistent framerate, like boxing games)
- 25-30 fps avg (playable but with some hiccups in games whose framerate varies with the environment the graphics card is rendering; e.g. on a plain, in a forest, with lots of explosions going on, or inside a corridor with few moving objects and lights)
- 30-50 fps avg (enough graphics power to play most games fluidly).
You need a minimum of 20 fps for a game to feel fluid. Anything above that isn't noticeable to the human eye anyway and is just lost as heat. Anything below 20 fps looks laggy and full of hiccups, or in the worst case (below 5 fps) like a slideshow of pictures, and is totally unplayable.
So average framerates of more than 35~45 are only good as a reference against older systems. An extra 25 fps really won't make squat of a difference when you're already running a game at 120 fps anyway!
I think what matters most is: is a game playable at a certain quality setting and resolution...
Any result above 45 fps average, or 25 fps minimum, should lead to a 'yes'.
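Put that rule into a few lines of code (reading the "or" literally, with names that are just my own illustration) and it would look something like this:
[code]
# Sketch of the playability rule above: a benchmark result counts as
# "playable" when avg fps is above 45 or min fps stays above 25.
# The thresholds are the ones from this comment; the function name is made up.

def is_playable(avg_fps: float, min_fps: float) -> bool:
    """Return True if a benchmark result should lead to a 'yes'."""
    return avg_fps > 45 or min_fps > 25

# Example: 60 fps average with dips to 28 fps passes; 22 avg / 12 min fails.
print(is_playable(avg_fps=60, min_fps=28))  # True
print(is_playable(avg_fps=22, min_fps=12))  # False
[/code]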
It's not like the gaming industry is making endless games that will tax hardware. Many games made today are graphically well below the point of taxing a modern graphics card. So in essence only a few of the top games at max resolution need to be tested, to see whether, when I buy the card, I'll be able to play that game fluidly or not.