OK, many of us have an obsession with fps, but I struggle to get my head around the whole thing. I mean, I've seen people worried that their 250fps system is now only playing at 150fps, etc. (just an example). Obviously there's an issue with really low fps (like well under 60), but I've read the human eye can only detect somewhere between 30-60 fps in most scenarios, so why the concern if your system goes from 250 to 150?

Also, isn't it ultimately limited by your monitor, e.g. 60Hz, 144Hz and 240Hz? (Which, unless I'm wrong, corresponds to how many times per second these monitors can refresh, i.e. a maximum of 60, 144 or 240 displayed frames per second.) So if a GPU is rendering at 150 fps but the monitor is 60Hz, isn't it only truly displaying 60 fps?

Can someone point me to an article or link that explains all this really well, please?
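For what it's worth, here's a quick back-of-the-envelope sketch of the arithmetic behind the question: frame time (how long each frame is on screen) versus what a monitor can actually display. The function names are just made up for illustration.

```python
def frame_time_ms(fps):
    """Time each frame takes, in milliseconds."""
    return 1000.0 / fps

def displayed_fps(gpu_fps, monitor_hz):
    """A monitor can't show more frames per second than its refresh rate."""
    return min(gpu_fps, monitor_hz)

# Dropping from 250 to 150 fps only changes frame time by under 3 ms:
print(round(frame_time_ms(250), 2))  # 4.0
print(round(frame_time_ms(150), 2))  # 6.67

# And a GPU rendering 150 fps into a 60Hz monitor still only displays 60 fps:
print(displayed_fps(150, 60))  # 60
```

So by this rough math, 250 vs 150 fps is a difference of under 3 ms per frame, which is part of why I don't get the worry.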