Hey guys,
I recently learned the causes of screen tearing and stutter in games. As I understand it, tearing occurs when the frame rate exceeds the refresh rate, and stutter happens when it drops below the refresh rate with V-Sync on. So G-Sync and FreeSync eliminate both.
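To make sure I have the mechanism right, here's a toy Python sketch of V-Sync on a 120 Hz display (the frame times are made up, just to illustrate the idea):

```python
# Toy model of V-Sync on a 120 Hz display. Frame times are hypothetical;
# the point is just what happens when a frame misses the refresh deadline.

REFRESH_HZ = 120
REFRESH_INTERVAL_MS = 1000 / REFRESH_HZ  # ~8.33 ms between refreshes

# Hypothetical per-frame render times in milliseconds.
frame_times_ms = [7.0, 8.0, 9.5, 7.5, 10.2, 8.1]

for t in frame_times_ms:
    if t <= REFRESH_INTERVAL_MS:
        print(f"{t:5.1f} ms -> ready in time, shown on the next refresh")
    else:
        # With V-Sync on, a late frame has to wait for the refresh after
        # that one, so the previous frame is displayed twice. That
        # repeated frame is the stutter.
        print(f"{t:5.1f} ms -> missed the refresh, previous frame repeats (stutter)")
```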
BUT what if you have a GPU with exactly the right amount of horsepower, so it doesn't feed the display more frames than the refresh rate but still stays at 60 fps or above? Then technically you wouldn't need adaptive sync technology like FreeSync/G-Sync, right? Your frame rate shouldn't fluctuate that much between 60 and 120, assuming all your components are balanced and you're getting the smallest fluctuation possible.
For example, I have an RTX 2080 Ti and I easily get 140 fps with V-Sync off in a game like Far Cry 5 on Ultra at 1080p. So on a 120 Hz monitor, the fps exceeds 120 and I'll get tearing. And if I enable V-Sync, I'll get stutter.
So what if I get a slightly less powerful GPU, one that can still run Ultra but tops out at, say, 115 fps instead of 140? Since it never goes over 120, I shouldn't get tearing, so I don't need V-Sync and thus won't get stutter either. Or do I still need adaptive sync technology regardless?
I suppose to sum up my question: is G-Sync/FreeSync still needed even if you never go over your refresh rate? It was designed to prevent tearing and stutter, but even when you stay under your refresh rate, does it just make overall gameplay smoother?
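To picture what I mean by "minimal fluctuation", here's a quick toy script (the average and the spread are invented) showing that a card averaging ~115 fps still renders individual frames faster and slower than that:

```python
import random

# Toy model of frame-time jitter; the numbers are invented, just to show
# that "115 fps" is an average, not a steady cadence of one frame every
# 8.7 ms.
random.seed(1)

AVG_FRAME_MS = 1000 / 115  # ~8.7 ms average frame time at 115 fps
frame_times = [max(1.0, random.gauss(AVG_FRAME_MS, 1.5)) for _ in range(1000)]

avg_fps = 1000 / (sum(frame_times) / len(frame_times))
print(f"average: {avg_fps:.0f} fps")
print(f"fastest single frame: {1000 / min(frame_times):.0f} fps")
print(f"slowest single frame: {1000 / max(frame_times):.0f} fps")
```

So even with an average safely under 120, individual frames can finish faster than the 8.33 ms refresh interval, which is why I'm not sure capping the hardware is the same thing as actually syncing it.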
I ordered the Samsung RU8000 TV; it's one of the first TVs with FreeSync (currently Samsung makes the only FreeSync TVs) and it's 120 Hz. No TV currently goes above 120 Hz, excluding the BFGDs.
So I'm nervous about how my RTX card will perform on it, and I even ordered a Radeon VII to compare.
Only 12 FreeSync displays were certified by Nvidia as G-Sync Compatible, and I doubt my new TV is one of them.
Also, playing in 4K is out of the question, since HDMI 2.0 (and the TV obviously only has HDMI) only supports 4K at 60 Hz, and most games running at 4K on an RTX 2080 Ti will easily exceed 60 fps (quick bandwidth math below). Everyone tells me to just get a computer monitor, but I love the immersion of a TV, and I love using my gamepad. If FreeSync and G-Sync aren't necessary as long as I stay under the refresh rate, I may even consider a projector.
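The bandwidth math is roughly why 4K tops out at 60 Hz on HDMI 2.0 (this counts raw pixel data only; real links also carry blanking intervals, so actual required rates run somewhat higher):

```python
# Rough check on why HDMI 2.0 tops out at 4K/60. Raw pixel data only;
# blanking intervals push the real requirement somewhat higher.

def pixel_data_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

HDMI_2_0_EFFECTIVE_GBPS = 14.4  # ~18 Gbps link rate minus 8b/10b encoding overhead

for hz in (60, 120):
    need = pixel_data_gbps(3840, 2160, hz)
    verdict = "fits" if need <= HDMI_2_0_EFFECTIVE_GBPS else "does not fit"
    print(f"4K @ {hz} Hz: ~{need:.1f} Gbps -> {verdict} in ~{HDMI_2_0_EFFECTIVE_GBPS} Gbps")
```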
I saw reviews of the Radeon VII and it seems to hold a 60 fps minimum in most games at 1080p. So if it can max out all my games at 1080p and FreeSync really works, I may end up replacing my RTX 2080 Ti. I suppose the major lesson here is that more power isn't always better; it's all about stability for the display you're using.