FPS vs. how fast the human eye sees

cdpage

Distinguished
Aug 4, 2001
789
0
18,990
If the human eye can only see so fast, like 24 fps or something to that effect (someone can prove me wrong here if they like), then what is the point in having games run at 100 fps? I think I understand why a refresh rate over 90 Hz is better for you, as low frequencies will cause headaches.

Does the same apply to fps, or is it the same principle?



ASUS P4S8X - P4 2.4B - 2 x 512M DDR333 - ATI 9500 Pro(Sapphire) - WD 80G HD (8M Buffer) - SAMSUNG SV0844D 8G HD - LG 16X DVD - Yamaha F1 CDRW - Iomega Zip 250 int.
 

abitkt7araid

Distinguished
Oct 6, 2003
8
0
18,510
For CRT monitors I think higher FPS is better because even though we can't distinguish more than about 24 fps, anything lower than 70 or 80 does cause headaches like you mentioned. However, with the technology of a flat panel, 45 is sufficient because it takes a while for the color to fade from each pixel, if you get what I'm saying. That is also part of why flat panels are notorious for motion blur.

It's not how fast your computer is, it's how much you get done on it.
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
I have a feeling y'all are confusing refresh rate with framerate. While both pertain to vision, they are different but related concepts.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

cleeve

Illustrious
A couple factors come into play here:

1. From what I understand, the human eye can see differences up to 70 fps.
24 fps is fine for film because movie cameras capture "blur", which smears the moving object and makes it seem like smooth motion when seen at 24 FPS.
If you watch a video game at 24 fps, it looks much worse than film, because the moving objects are not appropriately blurred.

2. The most important factor is that most benchmarks post AVERAGE framerates without showing MINIMUM framerates. For example:

You finish the "X2: The Threat" benchmark and see that you have a 45 fps score. Well, that looks dandy.
But in REALITY, the framerate has dropped to 12 fps in some parts and peaked at 80 fps in others.

12 fps is VERY noticeably choppy. But you'd never know this video card/system combo would deliver 12 fps, because the benchmark showed 45 fps and you naturally assume the framerate stays around there for the whole game. That never happens: as more enemies/AI/effects/geometry come into play, framerates change. Every scene/room in the game will show different performance (a rough sketch below puts numbers on this).

For this reason, hardcore first-person-shooter junkies consider an average framerate of 60 fps or higher to be the bottom end of "playable", because actual framerates may drop a lot below that.
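
To put rough numbers on it, here's a quick Python sketch; the frame times are completely made up, just to show how a decent-looking average can hide ugly dips:

# Made-up per-frame render times (seconds): mostly fast frames, plus a stretch
# of very slow ones. A benchmark that only reports the average hides the dips.
frame_times = [1/80.0] * 300 + [1/45.0] * 300 + [1/12.0] * 60

total_time = sum(frame_times)
average_fps = len(frame_times) / total_time   # what the benchmark reports
minimum_fps = 1.0 / max(frame_times)          # worst single frame
maximum_fps = 1.0 / min(frame_times)          # best single frame

print("average: %.0f fps" % average_fps)      # ~43 fps, looks dandy
print("minimum: %.0f fps" % minimum_fps)      # 12 fps, very noticeably choppy
print("maximum: %.0f fps" % maximum_fps)      # 80 fps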

Hope this helped,


------------------
Radeon 9500 (modded to PRO w/8 pixel pipelines)
AMD AthlonXP 2400+ (O/C to 2600+ with 143 fsb)
3dMark03: 3586
 

abitkt7araid

Distinguished
Oct 6, 2003
8
0
18,510
OK, but if you have a monitor that runs at 45 Hz maximum, and your graphics card is drawing at 100, you won't see 100, you'll see whatever the refresh rate of your monitor is. Am I wrong?

It's not how fast your computer is, it's how much you get done on it.
 

Ion

Distinguished
Feb 18, 2003
379
0
18,780
That's true if you have a CRT monitor and V-sync turned on, but if I remember right, flat panels work differently.
 

abitkt7araid

Distinguished
Oct 6, 2003
8
0
18,510
I think it's true for both types of monitor. Regardless of whether V-sync is turned on or off, your FPS will never show higher than your refresh rate.

It's not how fast your computer is, it's how much you get done on it.
 

Willamette_sucks

Distinguished
Feb 24, 2002
1,940
0
19,780
If you have V-sync off and your card is rendering at a higher framerate than your monitor is refreshing at, your screen gets divided horizontally: the top third might be the top third of one frame, the middle third comes from the next frame, and the bottom third from the frame after that. Get it? This is called tearing (parts of different images being displayed at the same time).
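
Here's a toy Python sketch of the idea, assuming the monitor scans out its lines top to bottom at a steady rate and the card swaps in each new frame the moment it's finished; the refresh rate, line count and render rate are made-up numbers:

# Toy model of tearing: the monitor scans out lines top to bottom at a fixed
# refresh rate while the card swaps in new frames whenever they're ready
# (V-sync off). All numbers are made up for illustration.
refresh_hz = 60.0
lines = 600                      # scanlines per refresh
render_fps = 150.0               # card draws faster than the monitor refreshes

line_time = 1.0 / (refresh_hz * lines)   # time to scan out one line
frame_time = 1.0 / render_fps            # time to render one frame

current = None
bands = []                       # [first_line, last_line, frame_number]
for line in range(lines):
    t = line * line_time
    frame_number = int(t / frame_time)   # newest finished frame at this instant
    if frame_number != current:
        bands.append([line, line, frame_number])
        current = frame_number
    else:
        bands[-1][1] = line

for first, last, frame in bands:
    print("lines %3d-%3d come from frame %d" % (first, last, frame))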

And BTW, your monitor does not run at 45 Hz. Get a clue.


Stop asking people "whats up" or "hows it goin", you know you don't give a sh*t. And noone cares how you're doing either.

"We are far less than we knew." - Bright Eyes
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
How can 60 FPS be the bottom-end minimum? I've beaten entire games on maybe a third of that framerate, like Half-Life. I consider 30 or 40 FPS to be pretty playable, and the instances when the framerate drops that low are extremely rare. I've looked back at the benchmarks of the old "Voodoo" days: on the new games of the time, the Voodoo would score in the 30s or 40s, and that was considered awesome back then. In fact, the Voodoo is the chip that revolutionized the 3D graphics industry, and it wouldn't qualify as "bottom-end playable" even though people were gawking in amazement at its outstanding visual quality and uber-fast framerates that totally blew the standard 2D PCI cards out of the water.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

Willamette_sucks

Distinguished
Feb 24, 2002
1,940
0
19,780
He's talking about the AVERAGE! If you're pulling a 30 fps average in a game, chances are it's gonna drop significantly below that, which is bad. I agree that a 30 fps minimum would probably be livable, although I generally try to keep the minimum (in heavy situations) at around 40 (in UT2k3, for example).


Stop asking people "whats up" or "hows it goin", you know you don't give a sh*t. And noone cares how you're doing either.

"We are far less than we knew." - Bright Eyes
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
I was talking about average framerates. I'm not saying 30 or 40 is the best by any means, but I've played and beaten entire games with an average framerate below 30 FPS on the most difficult settings. Now for multiplayer gaming, where you can't afford to take any chances of a low framerate giving your opponents the upper hand, you pretty much need a 60 FPS average in order to cut it; in multiplayer there is no reload button. But in single player, your opponents are much dumber and you can get by on way less. I think I've beaten more games on less than a 40 FPS average than I have on a 60 FPS average and higher.

When I was quoting the benchmarks for the old Voodoo cards: people back in 1996 would sell their souls to the devil to get a 30-40 FPS average in Tomb Raider or Quake at 640x480 with a 4MB Voodoo card. If I played games with framerates far less than the "bottom end for playability", how the heck did I beat them? It's not like I'm some Jedi and the Force guided my every move, did it :wink: ? No, it had to be playable, or I could not have beaten it under any circumstance whatsoever. That's why I consider 30 FPS the bare minimum average for single-player first-person shooters: it can be done, by all means it can be done. I've beaten entire games with averages less than half of your "barely survivable" 60 FPS average. I also consider at least a 45-60 FPS average the bare minimum for multiplayer, depending on the game.

Before I leave this post, I will admit that the framerates I got in those games were very far from perfect, but to me perfect does not equal the absolutely barely survivable minimum for playability.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

goblinking

Distinguished
Jun 11, 2003
110
0
18,680
It's often considered best to have a framerate of double your refresh rate. That way you won't get any dropped frames, and your actual displayed framerate is maximized.
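
A rough Python sketch of why, with made-up jittery frame times (the 75 Hz refresh, the jitter amount and the seed are just assumptions): when the card only averages about the refresh rate, some refreshes have no new frame ready and simply repeat the old one; at roughly double the refresh rate that almost never happens.

import random

# Toy model: frames finish at jittery intervals around an average framerate,
# and the monitor shows the newest finished frame once per refresh.
# If no new frame has finished since the last refresh, that refresh repeats one.
def repeated_refreshes(avg_fps, refresh_hz=75.0, seconds=10.0, jitter=0.5, seed=1):
    random.seed(seed)
    t, completions = 0.0, []
    while t < seconds:
        t += (1.0 / avg_fps) * random.uniform(1.0 - jitter, 1.0 + jitter)
        completions.append(t)

    repeats, frame_idx, last_shown = 0, 0, -1
    refreshes = int(seconds * refresh_hz)
    for r in range(1, refreshes + 1):
        now = r / refresh_hz
        while frame_idx < len(completions) and completions[frame_idx] <= now:
            frame_idx += 1
        newest = frame_idx - 1            # index of the newest finished frame
        if newest == last_shown:
            repeats += 1                  # nothing new to show on this refresh
        last_shown = newest
    return repeats, refreshes

for avg in (75, 150):
    rep, total = repeated_refreshes(avg)
    print("%3d fps average on a 75 Hz monitor: %d of %d refreshes repeat an old frame"
          % (avg, rep, total))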
 

mopeygoth

Distinguished
Aug 1, 2003
765
6
18,985
I have to agree with you here. Back in the day I used to play high-end games on a PII/GF2 Pro. People went "how can you play that on that machine?!" I did... and I won... even in the multiplayer games where my friends had twice the PC power. It is all a matter of getting used to it: you will learn to predict where and how to move and where to shoot if you know your machine's weaknesses and in which manner it lags. No headaches on my part... coffee syndrome!

Intel Pentium 4 - 400Mhz FSB - 2.4Ghz
- 1024mb pc-133 - Ati Radeon 9700@350mhz 128Mb@610mhz
 

cleeve

Illustrious
C'mon guys, this is what I said:

"Because of this reason, hardcore first-person-shooter junkies consider an average framerate of 60fps or higher to be the bottom end of "playable", because actual framerates may lower alot from there."

Note that I said both "Hardcore" and "Junkies" to denote that I was talking about elite twitch players.

I never said that games are unplayable below that. Hell, I think my 386 musta played DOOM at 12 fps if I was lucky, and I played through that just fine.

Just making a point is all.

------------------
Radeon 9500 (modded to PRO w/8 pixel pipelines)
AMD AthlonXP 2400+ (O/C to 2600+ with 143 fsb)
3dMark03: 3586
 

phial

Splendid
Oct 29, 2002
6,757
0
25,780
I can tell the difference between 60 fps and 85 fps very easily. I couldn't always, but now I can, and it makes a huge difference.

-------


................
 

phial

Splendid
Oct 29, 2002
6,757
0
25,780
For CRT monitors I think higher FPS is better because even though we can't distinguish more than about 24 fps


Dude, if you sat an 8-year-old down and showed him a game running at 24 fps, and then immediately afterwards at 60 fps, he would be able to tell you which one looked "smoother".

-------


................
 

cdpage

Distinguished
Aug 4, 2001
789
0
18,990
Found this; it answers some questions but not others. It has some good points, though.

<A HREF="http://www.100fps.com/how_many_frames_can_humans_see.htm" target="_new">http://www.100fps.com/how_many_frames_can_humans_see.htm</A>



ASUS P4S8X - P4 2.4B - 2 x 512M DDR333 - ATI 9500 Pro(Sapphire) - WD 80G HD (8M Buffer) - SAMSUNG SV0844D 8G HD - LG 16X DVD - Yamaha F1 CDRW - Iomega Zip 250 int.
<P ID="edit"><FONT SIZE=-1><EM>Edited by cdpage on 10/06/03 02:49 PM.</EM></FONT></P>
 

flamethrower205

Illustrious
Jun 26, 2001
13,105
0
40,780
You bring up a good point: it also depends on the specific person's eyes. Some people are more sensitive to flicker and pick it up while others won't. It also depends on how tired you are, BTW. For example, I see flicker at 85 Hz, 100 Hz is almost perfect, and 120 Hz is just right. People tell me I'm crazy, lol. The same thing happens to me with DLP projectors: it's very rare that people see the flickering colors from the color wheel, but I see them as I watch.
In CS, I notice the difference between 100 fps and 60-75 fps for some reason. Whenever it hits 60-75, it doesn't feel as smooth, and lo and behold, I look down at my fps meter, and it's 60 fps!

The one and only "Monstrous BULLgarian!"
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Nice find, cdpage. It was kind of brief, but it covered a lot of topics pretty quickly.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

OzzieBloke

Distinguished
Mar 29, 2001
167
0
18,680
You also have to be careful, when defining how sensitive the eye is, about whether or not there is overlap in the images being displayed; the human eye is very good at picking up discrete images as opposed to continuous ones.
Example: a monitor, even at 120 Hz vertical refresh, with a graphics card able to put out just as many frames, will seem quite smooth in a game, particularly an FPS, since walking along a corridor or even turning around quickly still has parts of the image overlapping from frame to frame. Now take your mouse on the desktop and move it around in a circle very quickly. Even at a vertical refresh of 120 Hz (and this is just 2D, any modern card can do this), you will still be able to see discrete images of the mouse pointer, because the moving image is small and the frames do not overlap. Under such conditions, the human eye is quite capable of detecting over 200 discrete frames per second, and likely more.
Based on the discrete firing rate of a single cone in the eye, you theoretically should be able to detect around 400 FPS of *discrete* images, but because the rods and cones of the eye only synchronise on similar light input on a certain part of the retina, and not for different hues, the likely detectable rate would be lower. However, since a CRT is firing photons directly at the eye, and is not relying on reflected light like the rest of the objects we perceive, we are able to notice these variances in refresh far more. Hence, higher frame rates, particularly for small moving objects, will continue to look better the faster they get, even beyond 200 fps (assuming the vertical refresh of your monitor could manage 200 Hz... and that might be a bit much to ask; motion blur would overcome this by fooling the eye).
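
Here's a rough Python sketch of that overlap argument, with all the sizes and speeds made up: a small pointer swept quickly leaves gaps between its successive images, while a wide wall still overlaps itself from frame to frame.

# Rough sketch of the overlap argument with made-up numbers.
# An object that moves farther per refresh than its own width leaves visible
# gaps between successive images, so the eye sees a string of discrete copies.
def pixels_per_refresh(sweep_px_per_sec, refresh_hz):
    return sweep_px_per_sec / refresh_hz

cases = [
    ("mouse pointer", 16,  4000),   # ~16 px wide, swept fast across the desktop
    ("corridor wall", 800, 4000),   # wide surface filling most of the screen
]
for name, width_px, speed in cases:
    for hz in (60, 120, 200):
        step = pixels_per_refresh(speed, hz)
        overlap = step < width_px
        print("%-13s at %3d Hz: moves %5.1f px per refresh -> %s"
              % (name, hz, step, "overlaps" if overlap else "discrete images"))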

-

I plugged my ram into my motherboard, but unplugged it when I smelled cooked mutton.
 

Willamette_sucks

Distinguished
Feb 24, 2002
1,940
0
19,780
Interesting idea, but guess what? Watch the mouse pointer move around at 60 Hz, then 200 Hz. No difference. The update rate of the mouse pointer is NOT limited by the monitor! It's the input, the mouse itself.

Too bad, you lose. Go to hell.


Stop asking people "whats up" or "hows it goin", you know you don't give a sh*t. And noone cares how you're doing either.

"We are far less than we knew." - Bright Eyes
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
That was an interesting experiment, moving the cursor in 1-inch circles as fast as I can move it.

At 60 Hz I see about 12 discrete images. At 100 Hz I see more like 20 to 22.

At first that seemed like a silly experiment, but then I tried moving left to right, edge to edge, back and forth as fast as my mouse would permit. I can see about 6 distinct images at 60 Hz and about 10 at 100 Hz.
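
The arithmetic roughly works out if I was doing about 5 circles and 10 edge-to-edge sweeps per second (those speeds are just a guess), assuming each refresh catches the cursor in a new spot. A quick Python check:

# If every refresh shows the cursor in a new spot,
# images seen per stroke = refresh rate / strokes per second.
# The movement speeds (5 circles/sec, 10 sweeps/sec) are only guesses.
for refresh_hz in (60, 100):
    print("%d Hz: ~%.0f images per circle, ~%.0f images per sweep"
          % (refresh_hz, refresh_hz / 5.0, refresh_hz / 10.0))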

Now it occurs to me that if you imagine the mouse cursor is an aiming reticle in an FPS, then it's going to be easier to aim at the higher refresh rate (especially if you are trying to move it very quickly). Similarly, moving objects would be easier to track at the higher rates as well.

Is this the conclusion I am supposed to draw from your claims?



<b>56K, slow and steady does not win the race on internet!</b>