obsession over framerates

kurokaze

I fail to understand what the big deal is with getting an extra 10 frames/second in Q3 when your monitor is actually limiting what you see on the screen.

There are people obsessing over the fact that with an Ultra card you can get these incredible frame rates, but when their monitor is running 1024x768 at 85Hz with vsync, they don't see anything above 85 fps. Even turning off vsync doesn't change much, since the game just renders frames in memory that the monitor can't keep up with.
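To put some rough numbers on it, here's a quick Python sketch. The 85Hz and 120fps figures are just assumptions to illustrate the point, not measurements from any particular card:

# Rough sketch: how many rendered frames can actually reach the screen
# at a fixed monitor refresh rate. The numbers (85Hz, 120fps) are
# example assumptions, not benchmark results.

refresh_hz = 85       # monitor refresh rate (the vsync cap)
rendered_fps = 120    # what the card can supposedly push

displayed_fps = min(rendered_fps, refresh_hz)
never_seen_fps = max(0, rendered_fps - refresh_hz)

print(f"Frames you can actually see: {displayed_fps} per second")   # 85
print(f"Frames the monitor never shows: {never_seen_fps} per second")  # 35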

Will someone please enlighten me??
 
The framerates you see in a benchmark are an average. Since the framerate of a game is almost never consistent on most graphics cards, you can see sharp drops during the benchmark. For example, running Unreal Tournament on an i810 at 800x600x16 gave me an average of 26 fps. That might look OK at first, since the game is almost playable at that speed. But because the score is an average, you know the framerate had to fall below it at times. In this case on the i810, it dropped to about 17 fps during the bench, and in real gameplay I've seen it drop to around 5 fps at 800x600 under D3D.

So when we look for a graphics card, we tend to look for something that averages around 60 fps or better, in the hope that the framerate never drops below 30-60 fps.
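Here's a small Python sketch of why the average hides the dips. The per-second fps samples are made up to roughly mirror the i810 example above (average of 26, worst case 17 during the bench), not real benchmark data:

# Rough sketch: an "average fps" score can hide much lower dips.
# These per-second samples are invented to average 26 fps.

samples = [35, 33, 31, 29, 27, 25, 23, 21, 19, 17]  # fps measured each second

average = sum(samples) / len(samples)
worst = min(samples)

print(f"Average: {average:.0f} fps")  # 26 - looks almost playable
print(f"Minimum: {worst} fps")        # 17 - the part you actually feel

That's why a card averaging 60+ fps is the safer bet: even the dips tend to stay in playable territory.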

=
Running Celery 400 since 1999.